# wget – File Downloader
## Basic downloads
```shell
wget https://example.com/file.tar.gz          # download to cwd
wget -O output.tar.gz https://example.com/f   # custom filename
wget -P /tmp https://example.com/file.tar.gz  # save to /tmp/
wget -q https://example.com/file              # quiet (no progress bar)
wget -nv https://example.com/file             # no verbose (minimal output)
wget -S https://example.com                   # show server response headers
wget --spider https://example.com/file        # check URL without downloading
```
## Resume and retry
```shell
wget -c https://example.com/bigfile.tar.gz  # continue/resume download
wget --retry-connrefused -t 10 URL          # retry up to 10 times
wget -t 0 URL                               # infinite retries
wget -w 5 -t 10 URL                         # wait 5s between retries
wget --timeout=30 URL                       # connection/read timeout
```
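The `-t`/`-w` flags retry within a single wget invocation; if the wget process itself dies (network drop, reboot), re-running `wget -c` resumes the partial file. A minimal sketch of an outer restart loop — `retry` and its parameters are illustrative names, not wget options:

```shell
#!/bin/sh
# retry MAX DELAY CMD...: rerun CMD until it succeeds, up to MAX attempts,
# sleeping DELAY seconds between attempts.
retry() {
  max=$1; delay=$2; shift 2
  n=1
  until "$@"; do
    if [ "$n" -ge "$max" ]; then
      echo "giving up after $n attempts" >&2
      return 1
    fi
    n=$((n + 1))
    sleep "$delay"
  done
}

# Typical use (URL is an example):
# retry 10 5 wget -c https://example.com/bigfile.tar.gz
```

Because `wget -c` continues from the existing partial file, each restart only fetches the remaining bytes.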
## Authentication
```shell
wget --user=alice --password=secret https://protected.example.com/file
wget --ask-password --user=alice https://example.com/file   # prompt for password
wget --http-user=alice --http-password=s3cr3t https://example.com
wget --no-http-keep-alive --user=alice ...   # disable keepalive

# ~/.netrc (chmod 600) is read automatically; wget has no --netrc-file flag (that's curl)
# machine example.com login alice password secret
wget https://example.com/private/file
```
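A sketch of setting up a `.netrc` entry. The file is written to `./netrc.demo` here so nothing in `$HOME` is touched; for wget to pick it up, the real file must be `$HOME/.netrc`. Machine name and credentials are examples:

```shell
#!/bin/sh
# Write a .netrc-format entry: tokens may span multiple lines.
cat > netrc.demo <<'EOF'
machine example.com
  login alice
  password secret
EOF
chmod 600 netrc.demo   # keep credentials unreadable by other users

# Real use:
# mv netrc.demo ~/.netrc
# wget https://example.com/private/file   # credentials picked up automatically
```

Keeping credentials in `~/.netrc` also keeps them out of shell history and `ps` output, unlike `--password=...` on the command line.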
## Headers and cookies
```shell
wget --header="Authorization: Bearer TOKEN" https://api.example.com
wget --header="Accept: application/json" URL
wget --save-cookies cookies.txt \
     --post-data "user=alice&pass=secret" \
     https://example.com/login
wget --load-cookies cookies.txt https://example.com/protected
wget --keep-session-cookies --save-cookies cookies.txt URL
```
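`--save-cookies`/`--load-cookies` use the Netscape cookie-file format: tab-separated fields for domain, include-subdomains flag, path, secure flag, expiry (Unix time), name, and value. A hand-written demo file (values illustrative, not captured from a real session):

```shell
#!/bin/sh
# Build a Netscape-format cookie file by hand; wget can load it directly.
printf '# Netscape HTTP Cookie File\n' > cookies.demo
#        domain       subdoms  path  secure  expiry  name     value
printf 'example.com\tFALSE\t/\tFALSE\t0\tsession\tabc123\n' >> cookies.demo

# Real use (URL is an example):
# wget --load-cookies cookies.demo https://example.com/protected
```

An expiry of 0 marks a session cookie, which is why `--keep-session-cookies` is needed for wget to save such cookies at all.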
## Download speed control
```shell
wget --limit-rate=500k URL  # limit to 500 KB/s
wget --limit-rate=2m URL    # limit to 2 MB/s
wget -w 1 URL               # wait 1 second between requests
wget --random-wait URL      # random wait, 0.5–1.5× the -w value
```
## TLS / SSL
```shell
wget --no-check-certificate URL  # skip cert verification
wget --ca-certificate=/path/to/ca.pem URL
wget --certificate=client.pem --private-key=client.key URL
```
## Batch and list downloads
```shell
wget -i urls.txt                 # download all URLs from file
wget -i urls.txt -P /downloads/  # save all to directory
wget -q -i urls.txt -P /dest/ &  # background batch download

# Generate URL list and download
seq 1 100 | sed 's|.*|https://example.com/page/&|' | wget -i - -P pages/
```
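The pipeline above downloads serially. wget has no built-in parallelism, but `xargs -P` can fan a URL list out across several wget processes. A sketch — URLs and counts are illustrative, and the actual network call is left commented out:

```shell
#!/bin/sh
# Generate a demo URL list (6 illustrative URLs).
seq 1 6 | sed 's|.*|https://example.com/file/&.bin|' > urls.txt

# Fan out: 3 parallel workers, one URL per wget invocation.
# xargs -n 1 -P 3 wget -q -P downloads/ < urls.txt

head -n 1 urls.txt   # inspect the generated list
```

Keep the per-host parallelism modest (`-P 3` or so) to avoid hammering a single server; combine with `--limit-rate` if bandwidth is shared.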
## Recursive / mirror
```shell
# Download a full website
wget --mirror -p --convert-links \
     --no-parent -P ./site-mirror \
     https://docs.example.com/
# Flags explained:
#   --mirror        = -r -N -l inf --no-remove-listing
#   -p              = download all assets (CSS, images, JS)
#   --convert-links = rewrite links for offline use
#   --no-parent     = don't go above the given directory

# Recursive, limited depth
wget -r -l 2 https://example.com/docs/

# Download only specific file types
wget -r -l 2 -A "*.pdf,*.doc" https://example.com/resources/

# Exclude file types
wget -r -l 2 -R "*.jpg,*.png,*.gif" https://example.com/

# Span hosts, but follow links only to the listed domains
wget -r -H -D example.com https://example.com/
```
## Recursive options reference
| Flag | Meaning |
|---|---|
| -r / --recursive | Recursive download |
| -l N | Recursion depth (default 5; inf = unlimited) |
| -np / --no-parent | Don't go up to parent directories |
| -N | Only download newer files (timestamping) |
| -k / --convert-links | Convert links for local browsing |
| -p / --page-requisites | Get all assets needed to display page |
| -H | Span hosts (follow links to other domains) |
| -D LIST | Comma-separated domains to follow |
| -A LIST | Accept list (file patterns/extensions) |
| -R LIST | Reject list |
| -I LIST | Include directories |
| -X LIST | Exclude directories |
| --no-clobber | Don't overwrite existing files |
## Output and logging
```shell
wget -a wget.log URL          # append log to file
wget -o wget.log URL          # write log to file (overwrite)
wget --progress=bar URL       # progress bar style
wget --progress=dot:giga URL  # dot progress for big files
wget -q --show-progress URL   # quiet + progress bar
```
## Timestamps and conditional fetch
```shell
wget -N URL                      # only download if newer than local copy
wget --no-if-modified-since URL  # with -N: probe with HEAD instead of If-Modified-Since
```
## Background and daemon
```shell
wget -b URL                    # download in background
wget -b -o background.log URL  # background with log
tail -f wget-log               # watch background download progress (default log name)
```
## FTP support
```shell
wget ftp://ftp.example.com/pub/file.tar.gz
wget --ftp-user=alice --ftp-password=secret ftp://ftp.example.com/file
wget -r ftp://ftp.example.com/pub/  # recursive FTP
```
## wget vs curl
| Task | wget | curl |
|---|---|---|
| Simple download | wget URL | curl -LO URL |
| Save with custom name | wget -O name URL | curl -o name URL |
| Resume | wget -c URL | curl -C - -O URL |
| Batch from file | wget -i list.txt | xargs -n 1 curl -LO < list.txt |
| Recursive mirror | wget --mirror | not built-in |
| API / REST calls | limited | curl -X POST -d ... |
| Pipe to stdout | wget -qO- URL | curl -sS URL |
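Scripts that must run on machines with only one of the two tools can pick whichever is installed at runtime. A sketch — `fetch` is an illustrative name, not part of either tool:

```shell
#!/bin/sh
# fetch URL DEST: download with wget if present, else curl, else fail.
fetch() {
  url=$1; dest=$2
  if command -v wget >/dev/null 2>&1; then
    wget -q -c -O "$dest" "$url"
  elif command -v curl >/dev/null 2>&1; then
    curl -fsSL -C - -o "$dest" "$url"
  else
    echo "fetch: need wget or curl" >&2
    return 127
  fi
}

# Typical use (URL is an example):
# fetch https://example.com/file.tar.gz file.tar.gz
```

Both branches resume partial downloads (`-c` / `-C -`) and stay quiet except on error, matching the tip below.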
> [!TIP]
> For simple scripted downloads, `wget -q --show-progress -c -O "$dest" "$url"` is a reliable combination: quiet (no clutter), shows a progress bar, resumes if interrupted, and saves to a named file.