Normally, you download files from the internet with your browser, and its built-in download manager is the right choice. But what if you need to download files on a server that offers only a command line (terminal), or you want to work through a long list of files one at a time, or you want to cap the download speed so it doesn't eat up your entire bandwidth? These are just a few of the cases in which the wget command comes in handy.
For Linux, the fun part is that wget comes preinstalled on almost every machine. You can verify this by running the command:

wget --version
For Windows, there is a download available here. For simplicity, select the option labelled “Complete package, except sources”. Once downloaded, install it. On Windows XP the installation path is “C:\Program Files\GnuWin32\bin”; add that folder to the PATH environment variable to make wget easily accessible from any prompt.
For macOS, wget doesn’t come preinstalled, but getting it is easy. Before installing wget, you need Homebrew, the package manager for macOS. Go to http://brew.sh/ and install it by simply running the command mentioned there.
Once Homebrew is installed, run the following command to install wget:
brew install wget
Once it is installed and ready, we can get started.
Downloading a Single File
wget http://dummyurl.com/file.zip

This starts downloading the file into the folder the Terminal prompt is currently in. While downloading, wget displays the progress percentage, the download speed, the total bytes downloaded and the estimated remaining time.
Downloading a Single File to a Different Location
The command above saves the file in the Terminal’s current working directory, under its original name, file.zip.
To save it with a different name:
wget -O new_name.zip http://dummyurl.com/file.zip
To save it in a different location, give the path as well (it can be relative or absolute):
wget -O /home/user1/new_name.zip http://dummyurl.com/file.zip
Resume Download for a File
If your file was only partially downloaded, you can resume it with the -c switch, which stands for continue:
wget -c http://dummyurl.com/file.zip
To Limit the Max Download Rate/Speed
wget --limit-rate=50K http://dummyurl.com/file.zip
This caps the download speed at 50 KB per second. The rate also accepts k and m suffixes, e.g. --limit-rate=2m for 2 MB per second.
Download Files in the Background
If there are several downloads queued and you want wget to keep working on them in the background:

wget -b http://dummyurl.com/file.zip

wget detaches immediately and writes its progress to a file named wget-log in the current directory, which you can follow with tail -f wget-log.
Download Multiple Files from a List of URLs
The wget command can also take its input URLs from a text file, one URL per line, and will download them one after another:
wget -i list_of_urls.txt
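To illustrate, the list file is just plain text with one URL per line; the URLs and the filename below are the same dummy placeholders used throughout this post:

```shell
# Build a sample URL list, one URL per line (dummy placeholder URLs).
cat > list_of_urls.txt <<'EOF'
http://dummyurl.com/file1.zip
http://dummyurl.com/file2.zip
http://dummyurl.com/report.pdf
EOF

# wget then works through the file from top to bottom:
# wget -i list_of_urls.txt
wc -l < list_of_urls.txt
```

You can combine -i with the other switches shown above, e.g. wget -c -i list_of_urls.txt to make the whole batch resumable.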
These are only some of the scenarios I have come across. I will, inshAllah, keep updating this list of commands as the need arises.