How to resume interrupted download automatically in curl?

Linux | Curl | Resume Download

Linux Problem Overview


I'm working with curl on Linux. I'm downloading part of a file from an FTP server (using the -r option), but my connection is poor and keeps getting interrupted. I want to write a script that resumes the download when I'm connected again.

I've used this command, but it's not working:

until curl -r 666-9999 -C - --retry 999 -o "path/to/file" "ftp://path/to/remote/file"; do :; done

Linux Solutions


Solution 1 - Linux

curl -L -O your_url

This will download the file.

Now suppose your connection is interrupted:

curl -L -O -C - your_url

This will continue downloading from the last byte already on disk.

From the manpage:

>Use "-C -" to tell curl to automatically find out where/how to resume the transfer. It then uses the given output/input files to figure that out.
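Putting the two commands above together, a small retry loop keeps re-running curl with `-C -` until it exits successfully. A minimal sketch, with a stub function (`fake_download`, invented here) standing in for curl so the loop's behavior can be seen without a network; with real curl the loop body would be `curl -L -O -C - your_url`:

```shell
# Retry loop: keep re-invoking the download command until it exits 0.
# fake_download is a stand-in for curl that fails twice, then succeeds,
# purely to demonstrate the loop.
attempts=0
fake_download() {
    attempts=$((attempts + 1))
    if [ "$attempts" -lt 3 ]; then
        echo "attempt $attempts: interrupted"
        return 1          # nonzero exit, as curl reports on a dropped connection
    fi
    echo "attempt $attempts: complete"
    return 0
}

until fake_download; do
    sleep 1               # brief pause before retrying; with curl, -C - resumes where it left off
done
```

With real curl substituted in, each pass through the loop resumes from the bytes already written, so the file is never re-downloaded from scratch.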

Solution 2 - Linux

wget has been built specifically for this use case. From the man page:

>Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports regetting, it will instruct the server to continue the download from where it left off.

wget is available for almost all Linux distributions and is probably already installed on yours. Just use wget to download the file; it will re-establish the network connection and keep retrying until the file is completely transferred.

Solution 3 - Linux

You can check curl's exit code in a while loop and resume until the exit code indicates that the download has succeeded (exit code 18 means "partial file", i.e. only part of the file was transferred):

ec=18; while [ $ec -eq 18 ]; do curl -O -C - "http://www.example.com/a-big-archive.zip"; ec=$?; done

The example is taken from http://ilovesymposia.com/2013/04/11/automatically-resume-interrupted-downloads-in-osx-with-curl/
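The same idea can be wrapped in a small function. This is a sketch (the function name `resume_until_done` and the `fake_curl` stub are invented for illustration) that retries only while the exit code is 18, demonstrated against a stub so it runs without a network:

```shell
# resume_until_done (hypothetical name): re-run the given download command
# while it keeps exiting with 18, curl's "partial file" code.
resume_until_done() {
    ec=18
    while [ "$ec" -eq 18 ]; do
        "$@"              # e.g.: curl -O -C - "http://www.example.com/a-big-archive.zip"
        ec=$?
    done
    return "$ec"
}

# Stub in place of curl: reports a partial transfer twice, then finishes.
tries=0
fake_curl() {
    tries=$((tries + 1))
    [ "$tries" -lt 3 ] && return 18
    return 0
}

resume_until_done fake_curl
echo "finished after $tries tries"
```

Because only exit code 18 triggers a retry, genuine errors (bad URL, auth failure, and so on) stop the loop instead of retrying forever.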

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type       | Original Author  | Original Content on Stackoverflow
Question           | Bili the big     | View Question on Stackoverflow
Solution 1 - Linux | Jithin           | View Answer on Stackoverflow
Solution 2 - Linux | sjaensch         | View Answer on Stackoverflow
Solution 3 - Linux | Claudio Floreani | View Answer on Stackoverflow