# How to Use wget to Download Files (with Advanced Configuration for Speed) in Linux
## Table of Contents

- Introduction
- Step 1: Basic Usage of `wget`
- Step 2: Resume Interrupted Downloads
- Step 3: Downloading Multiple Files in Bulk
- Step 4: Speeding Up Downloads with Parallel Connections
- Step 5: Set Up a Supercharged Alias (`sget`)
- Common `wget` Options (Cheat Sheet)
- Conclusion
## 1. Introduction

`wget` is a powerful command-line tool in Linux used to download files from the web. It's especially useful for automated scripts and advanced file downloads. This guide covers basic and advanced uses of `wget`, focusing on optimizing download speed.
## 2. Step 1: Basic Usage of `wget`

To download a file using `wget`, the simplest command is:
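```bash
wget <URL>
```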
For example, with a placeholder URL:
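```bash
# example.com is a placeholder; substitute the real file URL
wget https://example.com/file.zip
```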
This downloads the file to your current directory.
## 3. Step 2: Resume Interrupted Downloads

If your download is interrupted, you can resume it by using the `-c` (continue) option:
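```bash
# -c picks up where the partial file left off (placeholder URL)
wget -c https://example.com/file.zip
```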
This resumes the download from where it left off.
## 4. Step 3: Downloading Multiple Files in Bulk

You can download multiple files at once by listing their URLs in a text file and passing it to `wget` with the `-i` option.
First, create a text file (e.g., `urls.txt`) containing the list of URLs, one per line (the URLs below are placeholders):
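```text
https://example.com/file1.zip
https://example.com/file2.zip
https://example.com/file3.zip
```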
Then download all the files in the list:
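```bash
wget -i urls.txt
```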
## 5. Step 4: Speeding Up Downloads with Parallel Connections

`wget` doesn't support parallel connections natively, but you can still make downloads faster and more resilient by combining options such as `-c`, `--limit-rate`, and `--no-check-certificate`:
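```bash
# the URL is a placeholder; substitute your own
wget -c --limit-rate=2M --no-check-certificate --tries=3 https://example.com/file.zip
```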
This command does the following:

- `-c`: Continues from where the download left off.
- `--limit-rate=2M`: Limits the download speed to 2 MB/s, which can stabilize the transfer and prevent connection issues.
- `--no-check-certificate`: Skips SSL certificate verification (use with caution).
- `--tries=3`: Retries up to 3 times if the download fails.
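If you want actual parallelism, a common workaround is to run several `wget` processes at once. Here's a minimal sketch using `xargs -P` with the `urls.txt` file from Step 3 (the count of 4 parallel processes is an arbitrary choice):

```bash
# Pass one URL per wget invocation (-n 1) and run up to
# 4 downloads concurrently (-P 4); urls.txt has one URL per line
xargs -n 1 -P 4 wget -c --tries=3 < urls.txt
```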
## 6. Step 5: Set Up a Supercharged Alias (`sget`)

To create an alias called `sget` (superget) with optimized options for faster and more resilient downloads:
First, open your `.bashrc` or `.zshrc` file in an editor such as `nano`:
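```bash
nano ~/.bashrc   # or ~/.zshrc if you use zsh
```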
Next, add the following alias:
```bash
alias sget='wget -c --limit-rate=2M --no-check-certificate --tries=5 --timeout=10 --no-clobber --random-wait --retry-connrefused'
```
Explanation of the options:

- `-c`: Resume interrupted downloads.
- `--limit-rate=2M`: Cap download speed at 2 MB/s for stability.
- `--no-check-certificate`: Skip SSL certificate verification (use with caution).
- `--tries=5`: Retry up to 5 times on failure.
- `--timeout=10`: Give up on unresponsive connections after 10 seconds.
- `--no-clobber`: Prevent overwriting existing files.
- `--random-wait`: Randomize wait times between requests to avoid overloading servers.
- `--retry-connrefused`: Retry even if the connection is refused.
Then save and exit the editor (`Ctrl + O`, `Enter`, then `Ctrl + X`).
Finally, apply the changes:
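```bash
source ~/.bashrc   # or: source ~/.zshrc
```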
Now you can use `sget` as a shortcut for optimized `wget` downloads:
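```bash
# placeholder URL; sget expands to the long wget command above
sget https://example.com/file.zip
```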
## 7. Common `wget` Options (Cheat Sheet)

Here's a table of some useful `wget` options:
| Option | Description |
|---|---|
| `-c`, `--continue` | Resume broken downloads |
| `-i <file>` | Download files listed in a text file |
| `-b`, `--background` | Run wget in the background |
| `--limit-rate=<rate>` | Limit download speed (e.g., `2M` for 2 MB/s) |
| `--no-check-certificate` | Skip SSL certificate verification (use cautiously) |
| `--tries=<number>` | Set the number of retries for failed downloads |
| `--timeout=<seconds>` | Set a timeout period for unresponsive servers |
| `--no-clobber` | Avoid overwriting existing files |
| `--random-wait` | Randomize wait times between requests to avoid overloading servers |
| `--retry-connrefused` | Retry downloads if the connection is refused |
| `-r`, `--recursive` | Download websites or directories recursively |
| `--mirror` | Create a mirror of a website (equivalent to `-r -N -l inf --no-remove-listing`) |
## 8. Conclusion

`wget` is a versatile tool for downloading files in Linux. With a few tweaks, like the `sget` alias, you can optimize download speed and reliability. Whether you're downloading single files or managing bulk downloads, `wget` can handle it all with ease.