How to start multiple processes in Bash

Bash Problem Overview


I want to start 100 processes in bash, but the for statement doesn't seem to like the & symbol and I'm getting a syntax error. Here is what I have so far:

echo "Spawning 100 processes"
for i in {1..100}
do
    ./my_script.py &
done

EDIT: I was copy-pasting this code; that's why the & character was illegal.
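As the edit suggests, copy-pasted code often carries invisible non-ASCII bytes (a non-breaking space, a smart quote) that bash reports as a confusing syntax error. A quick way to spot them, sketched here with a demo file rather than the real script:

```shell
# Make a demo file containing a non-breaking space (U+00A0) between words:
printf 'echo\xc2\xa0hello\n' > pasted.sh

# cat -v makes non-printing bytes visible:
cat -v pasted.sh

# GNU grep can report any non-ASCII byte with its line number:
grep -nP '[^\x00-\x7F]' pasted.sh
```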

Bash Solutions


Solution 1 - Bash

echo "Spawning 100 processes"
for i in {1..100}
do
    ( ./my_script.py & )
done
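If the spawning script should pause until all the background jobs have finished, the built-in wait collects them. A minimal sketch, with sleep standing in for my_script.py:

```shell
#!/usr/bin/env bash
# Spawn several background jobs, then block until every child has exited.
for i in {1..5}
do
    sleep 0.1 &    # stand-in for ./my_script.py
done
wait               # returns only after all background children finish
echo "all done"
```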

Solution 2 - Bash

With GNU Parallel you can do:

echo "Spawning 100 processes"
parallel -j0 ./my_script.py ::: {1..100}

Or, to avoid passing the arguments 1 through 100 to my_script.py:

parallel -j0 -N0 ./my_script.py ::: {1..100}

Without -j0 it will spawn one process per CPU thread.

Watch the intro videos for more details: https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1
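If GNU Parallel is not installed, xargs -P (a widely available GNU/BSD extension, not strictly POSIX) gives a similar fan-out. A rough equivalent, using echo as the workload:

```shell
# Run up to 4 jobs at a time, one argument per invocation.
printf '%s\n' {1..100} | xargs -P 4 -n 1 echo job
```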

Solution 3 - Bash

The only reason I can think of why this wouldn't work is if you were really using some other shell, like /bin/sh.

Do you have #!/bin/bash at the top of your file? If yes, please change it to #!/bin/bash -x (to turn on tracing, or xtrace as it's called in the manual page) and paste the relevant output into your question, along with the exact syntax error that is occurring. If no, that might be your problem. ;-)
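For illustration, xtrace prints each command to stderr with a + prefix before running it, which pinpoints exactly where a script blows up:

```shell
# -x turns on tracing for this one-liner; each command is echoed to
# stderr, prefixed with +, before it executes.
bash -xc 'for i in 1 2; do echo "run $i"; done'
```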

The other possibility I can think of is that you have ^M characters (DOS line endings) in your file, which can produce errors such as the following, depending on where they fall and on whether the script starts with a #! line:

-bash: ./myscript.sh: /bin/bash^M: bad interpreter: No such file or directory
./myscript.sh: line 2: syntax error near unexpected token `do'

This page has a nice perl snippet that can remove them, as follows (modified slightly so it also works in the unlikely case that you have a stray ^M in the middle of a line):

perl -pi -e 's/\r//g' myscript.sh
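To see the fix end to end, here is a small sketch that deliberately creates a script with DOS line endings, strips them with the snippet above, and runs the result:

```shell
# Create a script whose line ends with CRLF, as a DOS editor would write it:
printf 'echo ok\r\n' > crlf_demo.sh

# Strip the carriage returns in place:
perl -pi -e 's/\r//g' crlf_demo.sh

# The script now runs cleanly and prints: ok
bash crlf_demo.sh
```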

Solution 4 - Bash

As others have noted, your snippet is valid code.

Not sure if this is what you need ... but you can fork twice:

( ( /complete/path/my_script.py & ) & )

This will let your script keep running even if the shell it was launched from is destroyed.
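Another common way to detach a job from the current shell, sketched here with sleep standing in for the real script, is the bash built-in disown:

```shell
#!/usr/bin/env bash
# Start a background job, then remove it from the shell's job table so
# the shell will not SIGHUP it when the shell itself exits.
sleep 2 &    # stand-in for /complete/path/my_script.py
disown       # forget the most recent background job
jobs         # prints nothing now: the job table is empty
```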

Solution 5 - Bash

In this example we have 2 parallel processes, each given a different argument (a number):

my_array=(1 2 3 4) ; printf '%s\n' "${my_array[@]}" | parallel -j 2 "echo {} &"

You can replace my_array with anything that generates input lines for the echo command (e.g. find *tif -printf "%f\n").

You can also use nohup to prevent the processes from being terminated when you close an ssh session: nohup sh -c 'find *tif -printf "%f\n" | parallel -j 2 echo {}' &

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | igorgue | View Question on Stackoverflow
Solution 1 - Bash | datenwolf | View Answer on Stackoverflow
Solution 2 - Bash | Ole Tange | View Answer on Stackoverflow
Solution 3 - Bash | mpontillo | View Answer on Stackoverflow
Solution 4 - Bash | stefgosselin | View Answer on Stackoverflow
Solution 5 - Bash | Jorge Mendes | View Answer on Stackoverflow