Re: [SLE] wget Question
On 12/22/05, Kai Ponte <kai@perfectreign.com> wrote:
This has been bugging me.
I want to use wget to download a local copy of a web page and all of its subcontents. However, I don't want anything above or outside of that tree. Can anyone help?
For example, I want this page and all sub-linked pages:
http://java.sun.com/docs/books/tutorial/jdbc/basics/index.html
I typed in wget -m http://java.sun.com/docs/books/tutorial/jdbc/basics/index.html
but then I get java.sun.com, www.sun.com, specs.org, developer.sun.com, bugs.sun.com...
Take a look at the -D option (--domains, to restrict which domains are followed), or --exclude-domains.
Also, look at -l (--level=depth) to set how deep the recursion goes.
-L (--relative, follow relative links only) is useful as well.
There are many more options; just check the man page.
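Putting those suggestions together, one plausible invocation for the original question might look like the following. This is a sketch, not a tested command for that exact site; the key addition is -np (--no-parent), which keeps wget from climbing above the starting directory, addressing the "nothing above the tree" requirement:

```shell
# Mirror only the jdbc/basics subtree:
#   -m   mirror (recursive, with timestamping and infinite depth)
#   -np  never ascend to the parent directory
#   -D   only follow links on java.sun.com (don't wander to www.sun.com etc.)
#   -k   convert links for local viewing (optional)
wget -m -np -k -D java.sun.com \
  http://java.sun.com/docs/books/tutorial/jdbc/basics/index.html
```

If page requisites (images, stylesheets) live elsewhere in the tree, adding -p (--page-requisites) pulls them in as well.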
Heh - that's what got me in trouble in the first place! Remember, friends don't let friends read man pages. Thanks to you and Carlos for providing me with the answer. Got it now!
--
kai ponte
www.perfectreign.com
linux - genuine windows replacement part