Glad it worked out for you!  The real credit goes to the developers of wget though... that is one awesome little program.
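For the list archives, the general shape of what Tim describes would be something like this (the URL and file list below are placeholders, not his actual script or list):

    # files.txt holds one red link per line, e.g. "File:Some-image.png" (placeholder name).
    # Strip the "File:" prefix and join the names with commas for wget's -A accept list.
    LIST=$(sed 's/^File://' files.txt | paste -sd, -)

    # Crawl recursively (-r), but only save files whose names are in the accept list (-A).
    # -L keeps wget on relative links so it does not chase the "Directory up" links,
    # and -nd drops everything into the current directory instead of mirroring paths.
    wget -r -L -nd -A "$LIST" http://old-wiki.example.org/images/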
Now that we got through the day without any more of those latency issues and we have some idea of what's behind the session problems, I feel pretty comfortable adding an extension for multiple file uploads.  I will do this tomorrow morning, and we'll take it from there.
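In the meantime, now that everything is sitting flat in the temp directory, the download side should not need the recursive crawl at all. Something along these lines should do it (files.txt is again just a placeholder for the cleaned-up list):

    # Fetch each name in files.txt straight from the flat temp directory; no recursion needed.
    sed 's/^File://' files.txt | while read name; do
        wget -nc "http://files.opensuse.org/opensuse/tmp/$name"
    done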
 
Is there anyone on the list who is willing to work on a fix for the session problems?  I will have time to work on it with anyone who wants to.  Also, is there a bug open on the issue yet?  It would be best to have something in Bugzilla about this.
 
-Matt

>>> Tim Mohlmann <muhlemmer@gmail.com> 7/15/2010 3:50 PM >>>
>Absolutely AWESOME! I owe you big time.

>I finally did get a working script, but you will not be too happy
>about it, since it would put a significant load on the server if
>everybody used it. (For that reason I will not post it here!)

>It came down to this: use wget's -A option (it accepts a list of
>suffixes, but complete file names work too), give it the complete list,
>stripped of the File: prefix and separated by commas, and then let it
>download recursively. That way wget walks through all the directories
>but only downloads the actual files. I am testing it right now, and it
>has already been running for 10 minutes or so, and that is with only a
>relatively small test list (although the length of the list is not the
>time factor here). I also used the -L option so it only follows
>relative links, which keeps it from chasing the "Directory up" links ;)

>But it has become obsolete now. I can modify it, though, to operate on
>the new temp structure and without the recursive part!

>Thanks very very much!

>2010/7/15 Matthew Ehle <mehle@novell.com>:
> How does this look:
>
> http://files.opensuse.org/opensuse/tmp
>
> Sorry, I would have gotten this to you a lot sooner, but I had a heck of a
> time figuring out how to exclude the thumbnails directory.
>
> -Matt
>
>>>>> "Matthew Ehle" <mehle@novell.com> 7/15/2010 2:59 PM >>>
>>Tim,
>
>>>Well, how many gigs are we talking about? I'd prefer not to, actually.
>
>>The answer is 1.4 GB.  If you don't want to do that, no worries.
>>>> Also, if the idea was for multiple people to be able to do something
>>>> like
>>>> this, I can even do something like create a temporary location on
>>>> files.o.o
>>>> with all the old-en files in one directory.
>
>>>This would be of GREAT help. If the location is known, it eliminates
>>>switching from the new to the old wiki all the time to find and
>>>download your missing images. If we can get multiple uploading, then
>>>it will be perfect:
>>>*Check the transferred articles
>>>*Paste all the red File: links into a text file
>>>*Download all the files in one go with wget
>>>*Upload them again in one go
>>>*10 pages fixed in no time!
>
>>I'll get it done.  I'll get back to you in hopefully just a little while.
>