Passing multiple parameters to pool.map() function in Python

Python, Multiprocessing, Pool, Map Function

Python Problem Overview


I need some way to use a function within pool.map() that accepts more than one parameter. As I understand it, the target function of pool.map() can only take a single argument from one iterable, but is there a way to pass other parameters in as well? In this case, I need to pass in a few configuration variables, such as my Lock() and some logging information, to the target function.

I have tried to do some research, and I think I may be able to use partial functions to get it to work, but I don't fully understand how they work. Any help would be greatly appreciated! Here is a simple example of what I want to do:

import multiprocessing

def target(items, lock):
    for item in items:
        # Do cool stuff
        if (... some condition here ...):
            lock.acquire()
            # Write to stdout or logfile, etc.
            lock.release()

def main():
    iterable = [1, 2, 3, 4, 5]
    pool = multiprocessing.Pool()
    pool.map(target(PASS PARAMS HERE), iterable)
    pool.close()
    pool.join()

Python Solutions


Solution 1 - Python

You can use functools.partial for this (as you suspected):

import multiprocessing
from functools import partial

def target(lock, iterable_item):
    # pool.map() calls this once per element, so iterable_item is a single value
    # Do cool stuff
    if (... some condition here ...):
        lock.acquire()
        # Write to stdout or logfile, etc.
        lock.release()

def main():
    iterable = [1, 2, 3, 4, 5]
    pool = multiprocessing.Pool()
    # A plain multiprocessing.Lock() cannot be pickled and sent to pool workers,
    # but a Manager().Lock() proxy can be passed as an argument
    lock = multiprocessing.Manager().Lock()
    func = partial(target, lock)
    pool.map(func, iterable)
    pool.close()
    pool.join()

Example:

def f(a, b, c):
    print("{} {} {}".format(a, b, c))

def main():
    iterable = [1, 2, 3, 4, 5]
    pool = multiprocessing.Pool()
    a = "hi"
    b = "there"
    func = partial(f, a, b)
    pool.map(func, iterable)
    pool.close()
    pool.join()

if __name__ == "__main__":
    main()

Output:

hi there 1
hi there 2
hi there 3
hi there 4
hi there 5
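
A note on the Lock specifically: a plain multiprocessing.Lock() cannot be pickled into pool tasks (it typically raises a RuntimeError about locks only being shared through inheritance), which is why the sketch above passes a Manager().Lock() proxy instead. Another common pattern, sketched here as an assumption rather than taken from the original answer (the init_worker name is made up for illustration), is to hand the lock to each worker once through the pool's initializer:

import multiprocessing

def init_worker(shared_lock):
    # Stash the lock in a module-level global inside each worker process
    global lock
    lock = shared_lock

def target(item):
    # `lock` was installed by init_worker when this worker process started
    with lock:
        print(item)

def main():
    iterable = [1, 2, 3, 4, 5]
    lck = multiprocessing.Lock()
    # initializer/initargs run once per worker, so the plain Lock is handed
    # over at process start-up rather than pickled with every task
    with multiprocessing.Pool(initializer=init_worker, initargs=(lck,)) as pool:
        pool.map(target, iterable)

if __name__ == "__main__":
    main()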

Solution 2 - Python

You could use a map function that allows multiple arguments, as does the fork of multiprocessing found in pathos.

>>> from pathos.multiprocessing import ProcessingPool as Pool
>>> 
>>> def add_and_subtract(x,y):
...   return x+y, x-y
... 
>>> res = Pool().map(add_and_subtract, range(0,20,2), range(-5,5,1))
>>> res
[(-5, 5), (-2, 6), (1, 7), (4, 8), (7, 9), (10, 10), (13, 11), (16, 12), (19, 13), (22, 14)]
>>> Pool().map(add_and_subtract, *zip(*res))
[(0, -10), (4, -8), (8, -6), (12, -4), (16, -2), (20, 0), (24, 2), (28, 4), (32, 6), (36, 8)]

pathos enables you to easily nest hierarchical parallel maps with multiple inputs, so we can extend our example to demonstrate that.

>>> from pathos.multiprocessing import ThreadingPool as TPool
>>> 
>>> res = TPool().amap(add_and_subtract, *zip(*Pool().map(add_and_subtract, range(0,20,2), range(-5,5,1))))
>>> res.get()
[(0, -10), (4, -8), (8, -6), (12, -4), (16, -2), (20, 0), (24, 2), (28, 4), (32, 6), (36, 8)]

Even more fun is to build a nested function that we can pass into the Pool. This is possible because pathos uses dill, which can serialize almost anything in Python.

>>> def build_fun_things(f, g):
...   def do_fun_things(x, y):
...     return f(x,y), g(x,y)
...   return do_fun_things
... 
>>> def add(x,y):
...   return x+y
... 
>>> def sub(x,y):
...   return x-y
... 
>>> neato = build_fun_things(add, sub)
>>> 
>>> res = TPool().imap(neato, *zip(*Pool().map(neato, range(0,20,2), range(-5,5,1))))
>>> list(res)
[(0, -10), (4, -8), (8, -6), (12, -4), (16, -2), (20, 0), (24, 2), (28, 4), (32, 6), (36, 8)]

If you are not able to go outside of the standard library, however, you will have to do this another way. Your best bet in that case is to use multiprocessing.Pool.starmap (Python 3.3+), as seen here: https://stackoverflow.com/questions/5442910/python-multiprocessing-pool-map-for-multiple-arguments (noted by @Roberto in the comments on the OP's post).
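
For completeness, here is a minimal sketch of that standard-library route (the add function and the argument tuples are placeholders, not from the original post); Pool.starmap unpacks each tuple into separate positional arguments:

import multiprocessing

def add(x, y):
    return x + y

def main():
    # Each tuple is unpacked by starmap, so add() receives two arguments
    with multiprocessing.Pool() as pool:
        results = pool.starmap(add, [(0, -5), (2, -4), (4, -3)])
    print(results)  # [-5, -2, 1]

if __name__ == "__main__":
    main()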

Get pathos here: https://github.com/uqfoundation

Solution 3 - Python

In case you don't have access to functools.partial, you could use a wrapper function for this, as well.

import multiprocessing

def target(lock):
    def wrapped_func(item):
        # pool.map() passes one element of the iterable per call
        # Do cool stuff
        if (... some condition here ...):
            lock.acquire()
            # Write to stdout or logfile, etc.
            lock.release()
    return wrapped_func

def main():
    iterable = [1, 2, 3, 4, 5]
    pool = multiprocessing.Pool()
    lck = multiprocessing.Lock()
    pool.map(target(lck), iterable)
    pool.close()
    pool.join()

This makes target() a function that accepts a lock (or whatever other parameters you want to give) and returns a function that takes only a single item from the iterable as input, while still being able to use all of your other parameters through the closure. That returned function is what is ultimately passed in to pool.map().
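
One caveat with the stock multiprocessing.Pool: the standard pickle module cannot serialize a function defined inside another function, so this closure approach may fail with a pickling error there (with pathos and dill, as in Solution 2, it serializes fine). A rough equivalent that standard pickling can handle, sketched here with a made-up Target class rather than taken from the original answer, is a module-level callable class that stores the extra parameters:

import multiprocessing

class Target:
    """Picklable wrapper that binds extra parameters for pool.map()."""
    def __init__(self, lock):
        self.lock = lock

    def __call__(self, item):
        # Do cool stuff with `item`, taking the lock when needed
        with self.lock:
            print(item)

def main():
    iterable = [1, 2, 3, 4, 5]
    # A Manager().Lock() proxy can travel inside the pickled wrapper
    lock = multiprocessing.Manager().Lock()
    with multiprocessing.Pool() as pool:
        pool.map(Target(lock), iterable)

if __name__ == "__main__":
    main()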

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | DJMcCarthy12 | View Question on Stackoverflow
Solution 1 - Python | dano | View Answer on Stackoverflow
Solution 2 - Python | Mike McKerns | View Answer on Stackoverflow
Solution 3 - Python | TheSoundDefense | View Answer on Stackoverflow