20 Sep 2001 05:49
A question related to this: is it possible with e.g. wget to determine the number of files that it is going to download? So, functionality comparable to find . -print, but then for an FTP connection...? On Wednesday 19 September 2001 23:06, dog@intop.net wrote:
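One possible answer to the counting question: wget's --spider option makes it traverse links without saving anything, so counting the request lines in its log gives a rough file count before a real download. A sketch only, assuming the placeholder host from the thread and wget's usual "--<time>--  <url>" log lines; the exact log format varies between wget versions, so check yours first.

```shell
# Dry-run a recursive retrieval and count the URLs wget would fetch.
# --spider: check each URL without downloading it
# -r:       recurse into links
# -np:      do not ascend above the starting directory
# 2>&1:     wget logs to stderr, so merge it into the pipe
# www.site.com is the placeholder host used earlier in this thread.
wget --spider -r -np http://www.site.com 2>&1 \
  | grep -c '^--'
```

Each request line in the log starts with "--", so grep -c on that prefix counts the files wget visited. For FTP specifically, plain `wget --no-remove-listing ftp://host/dir/` also leaves a .listing file per directory that can be inspected instead.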
wget -r http://www.site.com
On Wed, 19 Sep 2001, Linux - User wrote:
I need to download some pages from a website... like 180 pages... they are in plain HTML.
I also need to convert them to a StarOffice document...
Does anyone know a tool for that? A spider, maybe?
or ANY OTHER IDEA ???
--
Richard Bos
For those who have no home the journey is endless