I have been downloading this publication for years, and have never encountered difficulty doing so before the current issue. Suddenly, my system can't hold a connection long enough to download more than a few bytes. The publisher assures me that all is well at his end, and that any problem must be with the ISP. He may well be right, since for quite a while I have been experiencing disconnects with certain web sites, notably CNN.COM. I cannot imagine why such a problem should involve only certain sites, but I suspect it is likely to occur with large downloads from other servers as well.

The obvious solution would seem to be to use KGet to manage the operation. I have integrated KGet with Konqueror and attempted the download. At that point, the small KGet window (not the KGet window containing the source URL and the file name, but a reduced one asking, as far as I remember, only what to do with the file) disappeared, and I see no Net activity on the router LEDs. In other words, I have no clue what, if anything, KGet is doing, or how much, if any, it has downloaded. Nor do I see any way to make a decision (e.g. start again, give up, keep at it, ...). I know that it is running, because bash tells me so when I try to start it again.

I'd like some advice on what to do when KGet seems not to be doing anything, not reporting its condition, and not offering any way to control it. Any comments on how to try to correct the dropout problem will also be greatly appreciated.

--
Stan Goodman
Qiryat Tiv'on
Israel

--
To unsubscribe, e-mail: opensuse+unsubscribe@opensuse.org
To contact the owner, e-mail: opensuse+owner@opensuse.org
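P.S. For what it's worth, this is roughly the check I run from bash to see whether a KGet process is still alive; the process name "kget" is my assumption based on how Konqueror starts it, so adjust it if yours differs:

```shell
#!/bin/sh
# Report whether a process named "kget" is currently running,
# and list its PID(s) if so. Uses the standard pgrep utility.
if pgrep -x kget >/dev/null 2>&1; then
    echo "kget is running (PID(s): $(pgrep -x kget | tr '\n' ' '))"
else
    echo "kget is not running"
fi
```

If KGet stays this opaque, I imagine I could fall back to a resumable command-line fetch such as `wget -c <URL>`, since the `-c` option continues a partial download after a dropped connection, but I would prefer to understand what KGet is doing.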