• Download Single File
wget http://www.openss7.org/repos/tarballs/strx25-0.9.2.1.tar.bz2
  • Continue the Incomplete Download
wget -c http://www.openss7.org/repos/tarballs/strx25-0.9.2.1.tar.bz2
  • Download and Store With a Different File Name
wget -O taglist.zip "http://www.vim.org/scripts/download_script.php?src_id=7701"
  • Download in the Background
wget -b http://www.openss7.org/repos/tarballs/strx25-0.9.2.1.tar.bz2
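
When started with -b, wget detaches and writes its progress to wget-log in the current directory (assuming no other log file was specified with -o). You can watch the download with:

tail -f wget-log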
  • Mask the User Agent to Make wget Appear Like a Browser
wget --user-agent="Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.3) Gecko/2008092416 Firefox/3.0.3" URL-TO-DOWNLOAD
  • Increase Total Number of Retry Attempts

If the internet connection is unreliable and the file being downloaded is large, there is a chance the download will fail. By default, wget retries 20 times before giving up. If needed, you can increase the number of retry attempts using the --tries option as shown below.

wget --tries=75 DOWNLOAD-URL
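
wget also accepts 0 or inf as the value for unlimited retries, and this can be combined with -c to resume the download after each failure. A sketch, using the same placeholder URL:

wget --tries=inf -c DOWNLOAD-URL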
  • Download Multiple Files / URLs

First, store all the download URLs in a text file:

cat > download-file-list.txt
URL1
URL2
URL3
URL4

Next, pass download-file-list.txt to wget as an argument using the -i option as shown below.

wget -i download-file-list.txt
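
wget can also read the URL list from standard input if - is given to -i, which is handy in pipelines. A minimal sketch using the same list file:

cat download-file-list.txt | wget -i -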
  • Download a Full Website
wget --mirror -p --convert-links -P ./LOCAL-DIR WEBSITE-URL

--mirror : turn on options suitable for mirroring.
-p : download all files that are necessary to properly display a given HTML page.
--convert-links : after the download, convert the links in the document for local viewing.
-P ./LOCAL-DIR : save all the files and directories to the specified directory.
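
For example, mirroring a hypothetical site into a local directory (example.com and ./example-mirror are placeholders) would look like:

wget --mirror -p --convert-links -P ./example-mirror http://example.com/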

  • Reject Certain File Types while Downloading

You have found a website that is useful, but you don't want to download the images from it; in that case, reject them as shown below.

wget --reject=gif WEBSITE-TO-BE-DOWNLOADED
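
--reject takes a comma-separated list of suffixes or patterns, so several image types can be skipped in one run (the extensions below are only illustrative):

wget --reject=gif,jpg,png WEBSITE-TO-BE-DOWNLOADED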
  • Download Only Certain File Types
wget -r -A.pdf http://url-to-webpage-with-pdfs/
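
Likewise, -A accepts a comma-separated list, so a single recursive run can pick up more than one document type (pdf and zip here are just examples):

wget -r -A pdf,zip http://url-to-webpage-with-pdfs/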
  • FTP Download Using wget With Username and Password Authentication
wget --ftp-user=USERNAME --ftp-password=PASSWORD DOWNLOAD-URL
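
For anonymous FTP servers no credentials are needed; wget logs in as the anonymous user by default (the host and path below are placeholders):

wget ftp://ftp.example.com/path/to/file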