I have the following function:
```python
def copy_file(source_file, target_dir):
    pass
```
Now I would like to use multiprocessing to execute this function for every file in parallel:
```python
p = Pool(12)
p.map(lambda x: copy_file(x, target_dir), file_list)
```
The problem is that lambdas can't be pickled, so this fails. What is the neatest (most Pythonic) way to fix this?
Use a function object:

A lambda can't be pickled, but an instance of a top-level class that defines __call__ can, so you can pass such an object to Pool.map in place of the lambda.
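A minimal sketch of this approach (the Copier class name, the file names, and the target path are my own illustrations, not from the question):

```python
from multiprocessing import Pool

def copy_file(source_file, target_dir):
    pass  # stub from the question

class Copier(object):
    """Picklable replacement for the lambda: it remembers target_dir."""
    def __init__(self, target_dir):
        self.target_dir = target_dir

    def __call__(self, source_file):
        return copy_file(source_file, self.target_dir)

if __name__ == '__main__':
    file_list = ['a.txt', 'b.txt']  # illustrative
    p = Pool(12)
    p.map(Copier('/tmp/dest'), file_list)
    p.close()
    p.join()
```

Because Copier is defined at module level, its instances can be pickled and sent to the worker processes.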
For Python 2.7+ or Python 3, you could use functools.partial:
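A sketch using functools.partial; a partial over a module-level function pickles fine (the file names and target path are illustrative):

```python
import functools
from multiprocessing import Pool

def copy_file(source_file, target_dir):
    pass  # stub from the question

if __name__ == '__main__':
    file_list = ['a.txt', 'b.txt']  # illustrative
    # Bind target_dir once; the resulting partial object is picklable.
    copier = functools.partial(copy_file, target_dir='/tmp/dest')
    p = Pool(12)
    p.map(copier, file_list)
    p.close()
    p.join()
```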
From this answer, pathos lets you run your lambda

```python
p.map(lambda x: copy_file(x, target_dir), file_list)
```

directly, saving all the workarounds / hacks.

The question is a bit old, but if you still use Python 2, my answer can be useful.
The trick is to use part of the pathos project: multiprocess, a fork of multiprocessing. It gets rid of the annoying pickling limitation of the original multiprocessing by serializing with dill instead of pickle.
Installation:

```
pip install multiprocess
```

Usage: