Python multiprocessing's Pool process limit

Tags: Python, Multiprocessing, CPU Cores

Python Problem Overview


In using the Pool object from the multiprocessing module, is the number of processes limited by the number of CPU cores? E.g. if I have 4 cores, even if I create a Pool with 8 processes, only 4 will be running at one time?

Python Solutions


Solution 1 - Python

You can ask for as many processes as you like. Any limit that may exist will be imposed by your operating system, not by multiprocessing. For example,

 p = multiprocessing.Pool(1000000)

is likely to suffer an ugly death on any machine. I'm trying it on my box as I type this, and the OS is grinding my disk to dust swapping out RAM madly - finally killed it after it had created about 3000 processes ;-)

As to how many will run "at one time", Python has no say in that. It depends on:

  1. How many your hardware is capable of running simultaneously; and,
  2. How your operating system decides to give hardware resources to all the processes on your machine currently running.

For CPU-bound tasks, it doesn't make sense to create more Pool processes than you have cores to run them on. If you're trying to use your machine for other things too, then you should create fewer processes than cores.

For I/O-bound tasks, it may make sense to create quite a few more Pool processes than cores, since the processes will probably spend most of their time blocked (waiting for I/O to complete).

Solution 2 - Python

Yes. Theoretically there is no limit on the number of processes you can create, but starting an insane number of processes at once will bring the system down by running it out of memory. Note that processes have a much larger footprint than threads: threads share an address space, while each process gets its own.

So the best practice is to use a semaphore restricted to the number of processors in your system, like:

sem = multiprocessing.Semaphore(4)  # number of CPUs on your system

If you don't know the number of cores on your system, or if you want the code to run on many systems, something generic like the below will do:

sem = multiprocessing.Semaphore(multiprocessing.cpu_count())
# cpu_count() detects the number of cores on your system,
# and a semaphore is created with that value.

P.S. But it is usually good to use (number of cores - 1), leaving one core free for the rest of the system.

Hope this helps :)

Solution 3 - Python

While there is no limit, if you are looking for a convenient number to use for CPU-bound processes (which I suspect you are), you can run the following:

>>> import multiprocessing
>>> multiprocessing.cpu_count()
1

Some good notes on limitations (especially on Linux) appear in the answer here:

Solution 4 - Python

That is correct. If you have 4 cores, then 4 processes can be running at once. Remember that the system has its own work to do, so it is sensible to set the process count to number_of_cores - 1. This is a preference, not mandatory. Each process you create carries overhead, so you actually use more memory this way; but if RAM isn't a problem, go for it. If you are running CUDA or some other GPU-based library, that's a different paradigm, but that's for another question.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content TypeOriginal AuthorOriginal Content on Stackoverflow
Questionrottentomato56View Question on Stackoverflow
Solution 1 - PythonTim PetersView Answer on Stackoverflow
Solution 2 - PythonSravan K GhantasalaView Answer on Stackoverflow
Solution 3 - PythonSteve D.View Answer on Stackoverflow
Solution 4 - PythonBack2BasicsView Answer on Stackoverflow