On Wed, May 28, 2003 at 11:56:12AM +0530, Rohit wrote:
: Hi,
:
: What do I have to do if I wish to fetch a whole directory tree with
: wget, recursively? I have tried everything, but I do not know how to
: specify a tree for download. When I point it at a file, it gets the
: file; when I point it at a directory, it gets only an artificial
: index.html and nothing else.
:
: Can it even be done? I am trying to mirror ftp.suse.com's 8.2/i386
: branch on my local machine, for installations and so on, and I am
: having a hard time writing a script that would get INDEX.gz from SuSE,
: construct a file list, and then fetch every locally absent file from
: any of the 12-13 mirrors in round-robin fashion, with only one
: download stream per site at a time. Since all the sites are busy all
: the time, I think that is the way to get maximum throughput with one
: connection per site.

Take a look at the "--mirror" option. Depending on the file layout you
want on the local machine, you may also be interested in the '-nH' and
'--cut-dirs=' options. Full details can be found in the wget(1) man and
info pages.

However, that may not be what you really want. You can't effectively
mirror from multiple servers at the same time: the multiple processes
will step on each other's toes as each one tries to determine the
ever-changing differences between the local and remote filesystems.
You're probably better off taking the big hit the first time from a
single mirror and then keeping up to date with a nightly or weekly
cron job.

--Jerry

-- 
Open-Source software isn't a matter of life or death...
                           ...It's much more important than that!
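[A sketch of the single-mirror approach suggested above. The URL and the
cut-dirs count are illustrative, not necessarily SuSE's exact tree; check
wget(1) for your version. The real invocation would be something like
`wget --mirror -nH --cut-dirs=2 ftp://ftp.suse.com/pub/suse/i386/8.2/`,
where -nH drops the "ftp.suse.com/" host directory and --cut-dirs=2 drops
the first two remote path components ("pub/suse"). The shell function
below mimics that path rewriting so you can predict the local layout.]

```shell
# Mimic wget's --cut-dirs path rewriting (hypothetical helper, POSIX sh).
# $1 = remote path relative to the server root, $2 = components to drop.
cut_dirs() {
    path=$1
    n=$2
    while [ "$n" -gt 0 ]; do
        # strip the leading "component/" (shortest-match prefix removal)
        path=${path#*/}
        n=$((n - 1))
    done
    printf '%s\n' "$path"
}

# With -nH --cut-dirs=2, this remote file would land locally at
# i386/8.2/INDEX.gz:
cut_dirs "pub/suse/i386/8.2/INDEX.gz" 2
```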