
Using wget to download full website



pepribal
29th June 2007, 05:15 PM
Hi. I'd like to download a full wiki website so that I can access the Blender official manual offline.

I think the best way is using wget, but I haven't got it right. I've used:

wget -l100 --convert-links -p -r http://wiki.blender.org/index.php/Manual

but it doesn't produce working links: when I click any link, I get a page-not-found error.

Is there a way I can download a full working copy, either with wget or any other tool?

Thanks.

SlowJet
29th June 2007, 06:36 PM
wget is a pretty cool tool. :)

The problem is in the subtle differences between the -p, -l, and -r parameters, and wget has some extra combinations to use depending on what you want from the external site.
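
For example, here is my rough, untested reading of how those basic recursive flags fit together (the depth of 5 is just an example value):

# -r        recurse into links found on each page
# -l 5      limit recursion to 5 levels deep
# -p        also fetch page requisites (inline images, CSS) for every page
# -k        convert links in the saved pages to point at the local copies
wget -r -l 5 -p -k http://wiki.blender.org/index.php/Manual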

Here is the author's recommendation:

wget -E -H -k -K -p http://<site>/<document>

# I understand this to do whatever it takes to get the <document> to fully display,
# but that may not go deep enough, so you will need a different set of parameters to continue (-r, -l <depth>, and -p) to pick up things that aren't plain anchor-tag links, like inline images and styles, so maybe a --mirror parameter.
# It's not exactly light reading, is it? :)
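
If you want the whole manual rather than one page, a combo like this might do it (a sketch from my reading of the wget manual, not tested against that wiki):

# --mirror   shorthand for -r -N -l inf --no-remove-listing
# -k         convert links in the saved pages for local viewing
# -E         save pages with an .html extension, so index.php?... URLs open offline
# -p         fetch inline images and stylesheets for each page
# -np        never ascend to the parent directory
wget --mirror -k -E -p -np http://wiki.blender.org/index.php/Manual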

SJ