Run multiple curl commands in parallel

Bash | Shell | Curl

Bash Problem Overview


I have the following shell script. The issue is that I want to run the requests in parallel, without waiting for one request to finish before starting the next. For example, if I make 20 requests, I want them all executed at the same time.

for ((request=1;request<=20;request++))
do
    for ((x=1;x<=20;x++))
    do
        time curl -X POST "http://localhost:5000/example"
    done
done

Any guide?

Bash Solutions


Solution 1 - Bash

You can use xargs with the -P option to run any command in parallel:

seq 1 200 | xargs -n1 -P10  curl "http://localhost:5000/example"

This will run the curl command 200 times, with at most 10 jobs in parallel.
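As a quick sanity check (a sketch, not part of the original answer), the command can be dry-run by substituting echo for curl to see what xargs actually executes:

```shell
# Dry run: print the commands xargs would run instead of running them.
# Each number from seq is appended to the command line as a trailing argument.
seq 1 3 | xargs -n1 -P3 echo curl "http://localhost:5000/example"
```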

Solution 2 - Bash

Using the xargs -P option, you can run any command in parallel:

xargs -I % -P 8 curl -X POST "http://localhost:5000/example" \
< <(printf '%s\n' {1..400})

This will run the curl command 400 times, with at most 8 jobs in parallel.

Solution 3 - Bash

This is an addition to @saeed's answer.

I faced an issue where it made unnecessary requests to the following hosts:

0.0.0.1, 0.0.0.2 .... 0.0.0.N

The reason was that xargs was passing each input number as an argument to the curl command. To prevent the argument from being passed, we can specify which character to replace it with using the -I flag.

So we will use it as,

 ... xargs -I '$' command ...

Now, xargs will replace the argument wherever the $ literal is found. If the token is not found, the argument is not passed at all. Using this, the final command becomes:

seq 1 200 | xargs -I $ -n1 -P10  curl "http://localhost:5000/example"

Note: If you are using $ in your command try to replace it with some other character that is not being used.
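The effect of -I can be seen with echo in place of curl (a sketch, not from the original answer):

```shell
# Without -I, each input line is appended as a trailing argument:
seq 1 3 | xargs -n1 echo request
# With -I, the input is only substituted where the token appears;
# since '$' never occurs in the command, the numbers are dropped:
seq 1 3 | xargs -I $ echo request
```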

Solution 4 - Bash

Update 2020:

Curl can now fetch several websites in parallel (since curl 7.66.0):

curl --parallel --parallel-immediate --parallel-max 3 --config websites.txt

websites.txt file:

url = "website1.com"
url = "website2.com"
url = "website3.com"
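The config file can also be generated from a list of hosts. A minimal sketch (the hosts here are placeholders, and the curl invocation is commented out since it needs network access):

```shell
# Build websites.txt: each line is a curl config entry of the form url = "...".
printf 'url = "https://%s/"\n' example.com example.org example.net > websites.txt
cat websites.txt
# Then fetch all of them concurrently:
# curl --parallel --parallel-immediate --parallel-max 3 --config websites.txt
```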

Solution 5 - Bash

Adding to @saeed's answer, I created a generic function that uses its arguments to fire a command N times with at most M jobs in parallel:

function conc(){
    cmd=("${@:3}")                              # everything after the first two args is the command
    seq 1 "$1" | xargs -n1 -P"$2" "${cmd[@]}"   # run it $1 times, $2 jobs in parallel
}
$ conc N M cmd
$ conc 10 2 curl --location --request GET 'http://google.com/'

This will fire 10 curl commands with a maximum parallelism of two.

Adding this function to your .bashrc or .bash_profile makes it easier to reuse.

Solution 6 - Bash

Add “wait” at the end, and background them.

for ((request=1;request<=20;request++))
do
    for ((x=1;x<=20;x++))
    do
        time curl -X POST "http://localhost:5000/example" &
    done
done

wait

They will all output to the same stdout, but you can redirect the result of the time (and stdout and stderr) to a named file:

{ time curl -X POST "http://localhost:5000/example" ; } > output.${x}.${request}.out 2>&1 &
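A minimal network-free sketch of the same pattern: background each job, let each write to its own file, and use wait to block until all of them have finished.

```shell
# Each backgrounded job writes to its own output file.
for x in 1 2 3; do
    ( echo "result $x" > "output.$x.out" ) &
done
# wait blocks until every background job of this shell has exited.
wait
cat output.1.out output.2.out output.3.out
```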

Solution 7 - Bash

I wanted to share an example of how I used xargs to parallelise curl.

The advantage of using xargs is that you can specify how many workers will be used to parallelise curl, rather than backgrounding each curl with "&", which would schedule all (say, 10,000) requests simultaneously.
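That claim is easy to verify with sleep in place of curl (a sketch, not from the original answer): four one-second jobs limited to two workers finish in about two seconds, not four.

```shell
# 4 jobs of 1 second each, at most 2 running at once: ~2s wall time.
start=$(date +%s)
seq 1 4 | xargs -n1 -P2 sh -c 'sleep 1' _
end=$(date +%s)
echo "elapsed: $((end - start))s"
```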

Hope it will be helpful to somebody:

#!/bin/sh

url=/any-url
currentDate=$(date +%Y-%m-%d)
payload='{"field1":"value1", "field2":{},"timestamp":"'$currentDate'"}'
threadCount=10

cat "$1" | \
xargs -P "$threadCount" -I {} curl -sw 'url = %{url_effective}, http_status_code = %{http_code}, time_total = %{time_total} seconds \n' -H "Content-Type: application/json" -H "Accept: application/json" -X POST "$url" --max-time 60 -d "$payload"

The .csv file (passed as $1) has one value per row; xargs substitutes each value for the {} token in the JSON payload.
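A network-free dry run (with a hypothetical values.csv, and echo in place of curl) shows how xargs splices each row into the payload via the {} token, which here sits where the empty "field2" object would be:

```shell
# Hypothetical input file: one value per row.
printf '%s\n' 41 42 > values.csv
payload='{"field1":"value1","field2":{},"timestamp":"2024-01-01"}'
# xargs -I {} replaces the {} token inside the payload with each row.
cat values.csv | xargs -I {} echo "$payload"
```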

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author
Question | user5836023
Solution 1 - Bash | Saeed Mohtasham
Solution 2 - Bash | anubhava
Solution 3 - Bash | Subesh Bhandari
Solution 4 - Bash | Sergey Geron
Solution 5 - Bash | isopropylcyanide
Solution 6 - Bash | Gaétan RYCKEBOER
Solution 7 - Bash | Dzmitry Hubin