Tip: Download Accelerator for Linux

August 09, 2007
How do you download files in Linux? You can use Firefox's built-in download manager, which now supports pausing and resuming downloads. For small files, it will do the job. But what happens when the file you want to download is huge - such as a movie or the latest Linux distribution ISO? Many of the files I have downloaded via the Firefox download manager have ended up corrupted, or the download has hung midway and I have had to start all over again.

The most common and fail-safe method of downloading huge files in Linux is the wget command line tool. wget supports resuming interrupted downloads: with the -c option, you can pick up a download at a later stage if it fails due to a connection timeout.

I usually use the following wget command to download Linux distributions.

$ wget -c full_path_to_the_linux_iso

There is an interesting way of speeding up your downloads (accelerating the downloads) using another command line tool named curl.

A download accelerator in Linux

A download accelerator is software that connects to more than one location simultaneously and splits the download among them. This feature is commonly found in many download managers for Windows.

The following tip will help you speed up your download of files by a significant factor.

You will benefit most from this tip if you have a high-bandwidth internet connection - upwards of 4 Mbps.

Different mirrors usually have different speeds. A Linux distribution mirror in, say, Japan may sit on a 100 Mbps connection while a mirror in a different location is connected to only a 10 Mbps pipe. Moreover, these mirrors often throttle the bandwidth available to each individual connection, imposing an upper limit on download speed.

The Technique

Split the file you want to download into a number of pieces.

Let's say you want to download a Linux ISO. Before you start downloading the file, you can find out its size. Say it is 700 MB, i.e. 700 x 1024 x 1024 ≈ 734,000,000 bytes.
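You can check the size up front with a HEAD request: curl's -I option fetches only the response headers, and the Content-Length header reports the size in bytes. A minimal sketch (the URL below is a placeholder - substitute the mirror you are actually downloading from):

```shell
# Placeholder URL - substitute your mirror's address for the ISO.
url="http://releases.ubuntu.com/7.04/ubuntu-7.04-desktop-i386.iso"

# -s silences the progress meter, -I sends a HEAD request (headers only).
# The awk/tr pipeline pulls the byte count out of the Content-Length header.
size=$(curl -sI "$url" | awk 'tolower($1)=="content-length:" {print $2}' | tr -d '\r')
echo "Size: $size bytes"
```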

What you can do is split the file (700MB) you are downloading into a number of pieces and download each piece from a different mirror simultaneously. At the end of the download, you can combine all the pieces together to get your file in one piece.

Use the curl command line program to download individual pieces simultaneously as shown below.

I split the ISO file into four roughly equal parts (the last range is left open-ended, so it simply runs to the end of the file) and start downloading the first part as follows -

$ curl --range 0-199999999 -o ubuntu-iso.part1 $url1 &

The ampersand (&) at the end of the command runs the program in the background.

Do the same for the next three parts of the ISO file as shown below.

$ curl --range 200000000-399999999 -o ubuntu-iso.part2 $url2 &

$ curl --range 400000000-599999999 -o ubuntu-iso.part3 $url3 &

$ curl --range 600000000- -o ubuntu-iso.part4 $url4 &

This creates four background download processes, each transferring a different part of the ISO image from a different server.

--range asks the server for only that subrange of bytes of the file (curl sends an HTTP Range request, which the server must support).

-o sets the name of the file to which the data is to be saved.

So once all four curl processes finish their downloads, you will have four files, namely -

ubuntu-iso.part1, ubuntu-iso.part2, ubuntu-iso.part3 and
ubuntu-iso.part4 in your current directory.

$url1, $url2, $url3 and $url4 are user-defined variables holding the addresses of four different locations on the internet from where you are downloading the same ISO file.

You can define the $url variables as follows.

# Inside the script ...
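For example, along these lines - the mirror hostnames here are made up, so substitute four real mirrors that all carry the identical file:

```shell
# Hypothetical mirrors - replace with real mirrors hosting the identical ISO.
url1="http://mirror1.example.com/ubuntu-7.04-desktop-i386.iso"
url2="http://mirror2.example.com/ubuntu-7.04-desktop-i386.iso"
url3="http://mirror3.example.com/ubuntu-7.04-desktop-i386.iso"
url4="http://mirror4.example.com/ubuntu-7.04-desktop-i386.iso"
```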

Once all the parts have finished downloading, you have to combine the parts to form the whole ISO.

So to get the original Ubuntu ISO file, I just combine the parts using the cat command as follows:

$ cat ubuntu-iso.part? > ubuntu-7.04-desktop-i386.iso
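Since the pieces came from four different servers, it is worth verifying the reassembled ISO against the checksum the distribution publishes (the MD5SUMS file name is typical for Ubuntu mirrors of that era, but check what your mirror provides). A minimal check:

```shell
# Compute the MD5 checksum of the joined ISO and compare it against the
# value in the mirror's published MD5SUMS file (by eye, or with md5sum -c).
iso=ubuntu-7.04-desktop-i386.iso
if [ -f "$iso" ]; then
    md5sum "$iso"
fi
```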

Make it simple

To simplify the whole process of splitting, downloading, and joining the files, you can put the commands into a script (I have named it da.sh) and execute it.

#!/bin/sh
#FILE NAME : da.sh (Download accelerator)
# Define $url1 ... $url4 here, as described above.

curl --range 0-199999999 -o ubuntu-iso.part1 $url1 &
curl --range 200000000-399999999 -o ubuntu-iso.part2 $url2 &
curl --range 400000000-599999999 -o ubuntu-iso.part3 $url3 &
curl --range 600000000- -o ubuntu-iso.part4 $url4 &

# Wait for all four background downloads to finish, then join the parts.
wait
cat ubuntu-iso.part? > ubuntu-7.04-desktop-i386.iso

Set the executable bit on the file -

$ chmod u+x da.sh

and then run it.

$ ./da.sh
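The script above hard-codes the byte offsets for one particular file size. As a sketch (the function name and the 800,000,000-byte example are mine, not from the original post), you can compute the --range arguments for any size and part count like this:

```shell
#!/bin/sh
# Print one curl --range argument per part for a file of $1 bytes split
# into $2 parts. The last range is left open-ended so that integer-division
# rounding never drops the tail of the file.
print_ranges() {
    size=$1
    parts=$2
    chunk=$(( size / parts ))
    i=0
    while [ "$i" -lt "$parts" ]; do
        start=$(( i * chunk ))
        if [ "$i" -eq $(( parts - 1 )) ]; then
            echo "${start}-"
        else
            echo "${start}-$(( start + chunk - 1 ))"
        fi
        i=$(( i + 1 ))
    done
}

# An 800,000,000-byte file in 4 parts reproduces the ranges used above.
print_ranges 800000000 4
```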

I hope you enjoyed this tip on using the curl command line tool to accelerate downloads in Linux.


  • Tucanae Services

    interesting technique, but why not just torrent the file and be done with it?

  • I'd like to see some tips about graphical download managers, like Downloader for X.

  • thanks for sharing the information.

  • Ravi, thanks for the tip! This is very useful...

agree torrents are great for these kinds of downloads, however some ISPs are "torrent-allergic" and shape you like mad as soon as some very nasty device in their network detects you're running a P2P protocol... You're lucky if this is not your case...

    Cheers again!

  • I use axel which is a very nice download accelerator too.

sudo apt-get install axel

    To integrate any command line download manager with Firefox get the Firefox plug in "flashgot" then go to Tools > FlashGot > More Options. Add the name "Axel" and its path /usr/bin/axel and leave the command line option as [URL] and you are all set!

    This same thing could be done for curl too.

  • Woow...
    Thanks for this post.
    Very useful.

  • SKDownloader - Has a great gui, has acceleration features, is free.

  • AndresVia

    Try aria2:


It allows mirrored and segmented downloads per mirror, with the mirrors specified on the command line; on the other hand, axel's mirror search service does not always return more servers (at least for me).