--- Phil Driscoll
> On Tuesday 01 March 2005 18:37, Rob Keeling wrote:
> > Has anyone done this, and if so how?
>
> Careful (VERY careful!) use of 'wget --recursive' allows you to get
> entire websites cached in squid without someone having to physically
> click on all the links on a site.

Hmmm. I would have used curl(1) rather than wget, for all of the
inherent disasters that you'd incur with using the -r switch to wget.
YMMV, etc., etc.
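For what it's worth, a cautious invocation of the kind described above might look something like this. The proxy address (localhost:3128, squid's default port) and the target URL are assumptions for illustration; the safety flags are what keep -r from turning into one of those "inherent disasters":

```shell
# Sketch only: pre-warm a squid cache (assumed to listen on localhost:3128)
# by crawling a site through the proxy.
#   --level=2      limits recursion depth, so the crawl can't run away
#   --no-parent    stays below the starting directory
#   --wait=1       pauses a second between requests, politely
#   --delete-after discards each file locally once fetched, so the pages
#                  end up in squid's cache but not on your disk
http_proxy=http://localhost:3128 \
wget --recursive --level=2 --no-parent --wait=1 --delete-after \
     http://www.example.com/
```

The --delete-after flag is the key to using wget purely as a cache-warmer rather than a mirroring tool.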
There are other ways to do this, mind. It really does depend on the
number of pages.

-- 
Thomas Adam

=====
"The Linux Weekend Mechanic" -- http://linuxgazette.net
"TAG Editor" -- http://linuxgazette.net

"<shrug> We'll just save up your sins, Thomas, and punish you for all of
them at once when you get better. The experience will probably kill
you. :)" -- Benjamin A. Okopnik (Linux Gazette Technical Editor)