wget — File Downloader

Non-interactive network downloader. Covers single and batch downloads, recursive mirroring, authentication, resuming, rate limiting, and site archiving.

Basic downloads#

wget https://example.com/file.tar.gz         # download to cwd
wget -O output.tar.gz https://example.com/f  # custom filename
wget -P /tmp https://example.com/file.tar.gz # save to /tmp/
wget -q https://example.com/file             # quiet (no progress bar)
wget -nv https://example.com/file            # no verbose (minimal output)
wget -S https://example.com                  # show server response headers
wget --spider https://example.com/file       # check URL without downloading
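
The --spider check composes naturally with a download: fetch only when the URL actually responds. A minimal sketch (the function name is mine, not wget's):

```shell
# Download only if the URL answers; --spider's exit status decides.
fetch_if_live() {
    if wget -q --spider "$1"; then
        wget -q --show-progress "$1"   # URL is live: fetch it
    else
        echo "unreachable: $1" >&2     # DNS/connect/HTTP error
        return 1
    fi
}
```

Usage: `fetch_if_live https://example.com/file.tar.gz`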

Resume and retry#

wget -c https://example.com/bigfile.tar.gz   # continue/resume download
wget --retry-connrefused -t 10 URL           # retry even if refused, up to 10 tries
wget -t 0 URL                                # infinite retries
wget -w 5 -t 10 URL                          # wait 5s between retries
wget --timeout=30 URL                        # connection/read timeout
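
The retry flags above can be bundled into one wrapper for flaky links; a sketch assuming 5 tries and a 30-second timeout are acceptable (fetch_resilient is a made-up name):

```shell
# Resume partial files (-c), retry up to 5 times including refused
# connections, back off between tries, cap each network phase at 30 s.
fetch_resilient() {
    wget -c -t 5 --retry-connrefused --waitretry=2 --timeout=30 "$1"
}
```

Usage: `fetch_resilient https://example.com/bigfile.tar.gz`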

Authentication#

wget --user=alice --password=secret https://protected.example.com/file
wget --ask-password --user=alice https://example.com/file
wget --http-user=alice --http-password=s3cr3t https://example.com
wget --no-http-keep-alive --user=alice ...   # disable keepalive

# ~/.netrc file (chmod 600) is read automatically for credentials:
# machine example.com login alice password secret
wget https://example.com/private/file        # credentials come from ~/.netrc
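
To try the .netrc approach without touching your real `~/.netrc`, a sketch that builds a throwaway file first (host and credentials are placeholders):

```shell
# Build a netrc entry in a temp file; wget reads ~/.netrc automatically,
# so move the file into place (keeping it chmod 600) when you're ready.
netrc=$(mktemp)
cat > "$netrc" <<'EOF'
machine example.com login alice password secret
EOF
chmod 600 "$netrc"
```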

Headers and cookies#

wget --header="Authorization: Bearer TOKEN" https://api.example.com
wget --header="Accept: application/json" URL

wget --save-cookies cookies.txt \
     --post-data "user=alice&pass=secret" \
     https://example.com/login
wget --load-cookies cookies.txt https://example.com/protected
wget --keep-session-cookies --save-cookies cookies.txt URL
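
The three cookie commands above form a single session flow: log in once, keep the session cookie, reuse it for protected pages. A sketch with placeholder URLs and form fields:

```shell
# Log in and save session cookies, then fetch a protected page with them.
login_and_fetch() {
    local jar; jar=$(mktemp)
    wget -q --keep-session-cookies --save-cookies "$jar" \
         --post-data "user=alice&pass=secret" \
         -O /dev/null "https://example.com/login" &&
    wget -q --load-cookies "$jar" "https://example.com/protected"
    rm -f "$jar"
}
```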

Download speed control#

wget --limit-rate=500k URL                 # limit to 500 KB/s
wget --limit-rate=2m URL                   # limit to 2 MB/s
wget -w 1 URL                              # wait 1 second between requests
wget --random-wait URL                     # random wait 0.5–1.5× the -w value

TLS / SSL#

wget --no-check-certificate URL            # skip cert verification (insecure)
wget --ca-certificate=/path/to/ca.pem URL
wget --certificate=client.pem --private-key=client.key URL

Batch and list downloads#

wget -i urls.txt                            # download all URLs from file
wget -i urls.txt -P /downloads/             # save all to directory
wget -q -i urls.txt -P /dest/ &            # background batch download

# Generate URL list and download
seq 1 100 | sed 's|.*|https://example.com/page/&|' | wget -i - -P pages/
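
wget works through a list serially; for parallelism you can shard the list across several wget processes with xargs. A sketch (batch_fetch is a made-up name; -P4 means four concurrent jobs):

```shell
# Download every URL listed in file $1 into directory $2, four at a time.
batch_fetch() {
    mkdir -p "$2"
    xargs -r -n1 -P4 wget -q -P "$2" < "$1"
}
```

Usage: `batch_fetch urls.txt downloads/`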

Recursive / mirror#

# Download a full website
wget --mirror -p --convert-links \
     --no-parent -P ./site-mirror \
     https://docs.example.com/

# Flags explained:
# --mirror        = -r -N -l inf --no-remove-listing
# -p              = download all assets (CSS, images, JS)
# --convert-links = rewrite links for offline use
# --no-parent     = don't go above the given directory

# Recursive, limited depth
wget -r -l 2 https://example.com/docs/

# Download only specific file types
wget -r -l 2 -A "*.pdf,*.doc" https://example.com/resources/

# Exclude file types
wget -r -l 2 -R "*.jpg,*.png,*.gif" https://example.com/

# Follow links onto other hosts, but only the listed domains
# (without -H, recursion already stays on the starting host)
wget -r -H -D example.com https://example.com/

Recursive options reference#

Flag                     Meaning
-r / --recursive         Recursive download
-l N                     Recursion depth (default 5; inf = unlimited)
-np / --no-parent        Don't go up to parent directories
-N                       Only download newer files (timestamping)
-k / --convert-links     Convert links for local browsing
-p / --page-requisites   Get all assets needed to display the page
-H                       Span hosts (follow links to other domains)
-D DOMAIN                Comma-separated domains to follow
-A LIST                  Accept list (file patterns/extensions)
-R LIST                  Reject list
-I LIST                  Include directories
-X LIST                  Exclude directories
--no-clobber             Don't overwrite existing files

Output and logging#

wget -a wget.log URL                       # append log to file
wget -o wget.log URL                       # write log to file (overwrite)
wget --progress=bar URL                    # progress bar style
wget --progress=dot:giga URL              # dot progress for big files
wget -q --show-progress URL               # quiet + progress bar

Timestamps and conditional fetch#

wget -N URL                          # only download if newer than local copy
wget -N --no-if-modified-since URL   # check timestamps with a HEAD request instead of If-Modified-Since

Background and daemon#

wget -b URL                    # download in background
wget -b -o background.log URL  # background with custom log file
tail -f wget-log               # watch progress (wget-log is the default log name)

FTP support#

wget ftp://ftp.example.com/pub/file.tar.gz
wget --ftp-user=alice --ftp-password=secret ftp://ftp.example.com/file
wget -r ftp://ftp.example.com/pub/                # recursive FTP

wget vs curl#

Task                   wget                 curl
Simple download        wget URL             curl -LO URL
Save with custom name  wget -O name URL     curl -o name URL
Resume                 wget -c URL          curl -C - URL
Batch from file        wget -i list.txt     xargs -n1 curl -LO < list.txt
Recursive mirror       wget --mirror        not built-in
API / REST calls       limited              curl -X POST -d ...
Pipe to stdout         wget -qO- URL        curl -sS URL

[!TIP] For simple scripted downloads, wget -q --show-progress -c -O "$dest" "$url" is the most reliable combination: quiet (no clutter), shows a progress bar, resumes if interrupted, and saves to a named file.
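
That combination is worth a shell function; a sketch (dl is a made-up name; arguments are URL, then destination path):

```shell
# Quiet except for a progress bar; resumes if interrupted;
# writes to an explicit output path.
dl() {
    wget -q --show-progress -c -O "$2" "$1"
}
# dl https://example.com/release.tar.gz /tmp/release.tar.gz
```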