7. Examples
The examples are divided into three sections for clarity. The first section is a tutorial for beginners. The second section explains some of the more complex program features. The third section contains advice for mirror administrators, as well as even more complex features (that some would call perverted).
7.1 Simple Usage      Simple, basic usage of the program.
7.2 Advanced Usage    Advanced techniques of usage.
7.3 Guru Usage        Mirroring and the hairy stuff.
7.1 Simple Usage
Say you want to download a URL. Just type:

wget http://fly.srk.fer.hr/
The response will be something like:
--13:30:45--  http://fly.srk.fer.hr:80/en/
           => `index.html'
Connecting to fly.srk.fer.hr:80... connected!
HTTP request sent, awaiting response... 200 OK
Length: 4,694 [text/html]

    0K -> ....                                                   [100%]

13:30:46 (23.75 KB/s) - `index.html' saved [4694/4694]
If the connection is slow or flaky, the download may fail before the whole file is retrieved. To raise the number of retries to 45:

wget --tries=45 http://fly.srk.fer.hr/jpg/flyweb.jpg
It is tiring to type `--tries', so you can use `-t' instead; here the output is also written to the log file `log':

wget -t 45 -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &
The ampersand at the end of the line makes sure that Wget works in the background. To remove the limit on the number of retries, use `-t inf'.
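For instance, a backgrounded download that retries indefinitely, using the same example URL and log file as above, could look like:

wget -t inf -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &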
FTP retrieval is just as simple; note how Wget logs in as `anonymous' automatically:

$ wget ftp://gnjilux.srk.fer.hr/welcome.msg
--10:08:47--  ftp://gnjilux.srk.fer.hr:21/welcome.msg
           => `welcome.msg'
Connecting to gnjilux.srk.fer.hr:21... connected!
Logging in as anonymous ... Logged in!
==> TYPE I ... done.  ==> CWD not needed.
==> PORT ... done.    ==> RETR welcome.msg ... done.
Length: 1,340 (unauthoritative)

    0K -> .                                                      [100%]

10:08:48 (1.28 MB/s) - `welcome.msg' saved [1340]
If the URL points to a directory, Wget retrieves the directory listing and converts it to HTML, which you can then view with a browser such as lynx:

wget ftp://prep.ai.mit.edu/pub/gnu/
lynx index.html
7.2 Advanced Usage
If you have a file containing a list of URLs to download, use the `-i' option:

wget -i file
If you specify `-' as the file name, the URLs will be read from standard input.
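For example, assuming a hypothetical `urls.txt' with one URL per line, you could pipe it in like this:

cat urls.txt | wget -i -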
To retrieve a site recursively, with only one try per document, saving the log of the activity to `gnulog':

wget -r -t1 http://www.gnu.ai.mit.edu/ -o gnulog
To retrieve only the first level of links from a page:

wget -r -l1 http://www.yahoo.com/
To display the server response headers while retrieving a page, use `-S':

wget -S http://www.lycos.com/
To save the server headers together with the file, so you can inspect them later:

wget -s http://www.lycos.com/
more index.html
To retrieve the first two levels of a site, saving everything under `/tmp':

wget -P/tmp -l2 ftp://wuarchive.wustl.edu/
Suppose you want to download all the GIFs from an HTTP directory. `wget http://host/dir/*.gif' will not work, because HTTP retrieval does not support globbing. In that case, use:

wget -r -l1 --no-parent -A.gif http://host/dir/
It is a bit of a kludge, but it works. `-r -l1' means to retrieve recursively (see section 3. Recursive Retrieval), with a maximum depth of 1. `--no-parent' means that references to the parent directory are ignored (see section 4.6 Directory-Based Limits), and `-A.gif' means to download only the GIF files. `-A "*.gif"' would have worked too.
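Spelled out, that quoted-pattern alternative would be:

wget -r -l1 --no-parent -A "*.gif" http://host/dir/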
Suppose a recursive download was interrupted and you do not want Wget to clobber the files already present. Use `-nc' to skip them:

wget -nc -r http://www.gnu.ai.mit.edu/
If you want to supply your own username and password, you can encode them in the URL:

wget ftp://hniksic:mypassword@jagor.srce.hr/.emacs
If you do not like the default progress display, you can pick another dot style, for example the `binary' style:

wget --dot-style=binary ftp://prep.ai.mit.edu/pub/gnu/README
You can experiment with other styles, like:
wget --dot-style=mega ftp://ftp.xemacs.org/pub/xemacs/xemacs-20.4/xemacs-20.4.tar.gz
wget --dot-style=micro http://fly.srk.fer.hr/
To make these settings permanent, put them in your `.wgetrc', as described before (see section 6.4 Sample Wgetrc).
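As a sketch, assuming the corresponding `.wgetrc' command is spelled `dot_style', such an entry would look something like:

dot_style = binary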
7.3 Guru Usage
If you wish Wget to keep a mirror of a site, use `--mirror'; you can then put the command in your crontab file, asking it to recheck the site each Sunday:

crontab
0 0 * * 0 wget --mirror ftp://ftp.xemacs.org/pub/xemacs/ -o /home/me/weeklog
To mirror only the HTML files of a site, skipping images and other content, combine `--mirror' with an accept list:

wget --mirror -A.html http://www.w3.org/
When mirroring, you may want to follow links only within a particular domain; use `-D' for that:

wget -rN -Dsrce.hr http://www.srce.hr/
Now Wget will correctly find out that `regoc.srce.hr' is the same as `www.srce.hr', but will not even take into consideration the link to `www.mit.edu'.
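As an illustrative sketch (the second domain here is only an example), `-D' also accepts a comma-separated list of domains to follow:

wget -rN -Dsrce.hr,fer.hr http://www.srce.hr/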
To have the links in the downloaded documents converted so that they are suitable for local viewing, use `-k':

wget -k -r URL
You can also send the retrieved documents to standard output instead of files:

wget -O - http://jagor.srce.hr/ http://www.srce.hr/
You can also combine the two options and make weird pipelines to retrieve the documents from remote hotlists:
wget -O - http://cool.list.com/ | wget --force-html -i -