[openFATE 120326] Resume download
Feature changed by: Dave Plater (plater)
Feature #120326, revision 89
Title: Resume download

openSUSE-10.2: Rejected by Stanislav Visnovsky (visnov)
reject date: 2006-09-21 09:56:17
reject reason: Not enough resources to implement in time.
Priority Requester: Desirable / Projectmanager: Desirable

openSUSE-10.3: Rejected by Stanislav Visnovsky (visnov)
reject date: 2007-07-25 15:20:32
reject reason: Out of time. Postponing.
Priority Requester: Desirable / Projectmanager: Desirable

openSUSE-11.0: Rejected by Jiri Srain (jsrain)
reject date: 2008-03-28 13:51:03
reject reason: Out of resources for 11.0.
Priority Requester: Desirable / Projectmanager: Important

openSUSE-11.1: Rejected by Stanislav Visnovsky (visnov)
reject date: 2008-07-01 11:34:46
reject reason: Postponing, needs downloading refactor.
Priority Requester: Desirable

openSUSE-11.2: Evaluation
Priority Requester: Desirable / Projectmanager: Desirable

Requested by: Klaus Kämpf (kwk)

Description:
YaST/YOU times out too easily when downloading large packages like kde-base (14 MB) over a single ISDN connection. Please cache the half-downloaded package so I don't have to start from the beginning again. See:
http://bugzilla.novell.com/show_bug.cgi?id=suse9740
http://bugzilla.novell.com/show_bug.cgi?id=278507
https://bugzilla.novell.com/show_bug.cgi?id=545242

Discussion:

#1: Gerald Pfeifer (geraldpfeifer) (2006-06-30 17:40:31)
Klaus, do you know whether this is still an issue?

#2: Klaus Kämpf (kwk) (2006-06-30 18:38:23)
We still have very large packages (kernel, OpenOffice_org) which might not download completely in one go.

#5: Milisav Radmanic (radmanic) (2006-09-08 15:25:57) (reply to #4)
This applies to the media manager, and Jiri already agreed to return this to the YaST team, since Marius only helped out during CODE 10. Marius will of course help with the implementation by sharing his knowledge.
#7: Stanislav Visnovsky (visnov) (2007-11-23 10:32:46)
Related to commit-refactoring.

#9: Federico Lucifredi (flucifredi) (2008-06-12 20:21:33)
Klaus, are you still running ISDN? Just kidding :-) Stano, please give your opinion on the workload: is this easily achievable? If not, there are higher priorities.

#10: Stanislav Visnovsky (visnov) (2008-06-20 13:10:21) (reply to #9)
Jiri, could we get an estimate for this?

#11: Jiri Srain (jsrain) (2008-08-04 08:59:34) (reply to #10)
Since curl itself supports resuming downloads, this feature should not be hard to implement.

#12: Ruchir Brahmbhatt (ruchir) (2009-01-17 12:07:25)
I also vote for this feature.

#13: Dmitry Mittov (michael_knight) (2009-01-19 09:01:09)
It is also a great problem when you use a slow mirror. download.opensuse.org redirects me to one of the Yandex mirrors (score 20), and I get a timeout on big packages.

#14: Piotrek Juzwiak (benderbendingrodriguez) (2009-01-21 18:17:13)
I'd vote at least for a way to change the timeout settings for YaST, or doesn't that solve the problem?

#15: Alam Aby Bashit (init7) (2009-01-22 09:35:17)
I'd like to vote for this feature to be implemented in 11.2. You surely want resume capability if you have an unreliable yet slow internet connection for, say, updating KDE 4.2 :)

#16: Duncan Mac-Vicar (dmacvicar) (2009-01-22 15:14:44) (reply to #15)
Please stop these "I vote for this", "+1" or "me too" comments. There is no voting system in FATE yet, and following a discussion full of "I want this too" makes it hard to evaluate features.

#17: James Mason (bear454) (2009-01-24 06:05:01)
Could this be accomplished using a BitTorrent backend instead of curl?

#18: Jan Engelhardt (jengelh) (2009-01-30 15:31:33) (reply to #17)
ISDN is already slow as it is. I would not want to spend more time downloading just because of the metadata traffic that would happen. Not to mention what happens if there are no peers around, or they have configured themselves to limit their uploads.
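What comment #11 describes can be sketched concretely: curl resumes a partial transfer when given `-C -`, which makes it read the size of the existing local file and request only the remainder from the server (this relies on server-side HTTP range support). The URL and file name below are illustrative placeholders, not a real package location:

```shell
# Resume a partially downloaded package (placeholder URL).
# -C -     : continue at the offset implied by the existing output file
# --retry 5: retry up to 5 times on transient network errors
curl -C - --retry 5 -o kde-base.rpm \
  "http://download.opensuse.org/example/kde-base.rpm"
```

If the server ignores range requests, curl falls back to transferring the whole file, so the command degrades gracefully rather than failing.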
Still, most download.opensuse.org downloads are faster than a torrent for me.

#19: Pascal Bleser (pbleser) (2009-03-02 08:42:26)
Caching is one thing, but even using retries in curl would help; see "curl --retry".

#20: Piotrek Juzwiak (benderbendingrodriguez) (2009-04-23 16:38:17)
Could it be accomplished by using aria for downloading packages? There would be no more problems with timeouts and bad checksums.

#21: Ján Kupec (jkupec) (2009-04-23 17:36:18) (reply to #20)
Actually, we are already using aria in the current development branch, so this is not so urgent anymore. Still, the download can be interrupted in other ways than a connection timeout, e.g. by user decision, a sudden power outage, etc. Does anyone know whether aria supports resuming? (Implementing this for the curl backend is not important anymore.)

#22: Markus K (kamikazow) (2009-04-25 12:49:41) (reply to #21)
Yes, aria supports resuming, even better than wget (I don't know about curl), because aria uses the file size to check whether the to-be-downloaded file has changed.

#23: Peter Poeml (poeml) (2009-04-28 09:14:12) (reply to #21)
It does. See the section "Resuming Download" in its man page (http://aria2.sourceforge.net/aria2c.1.html#_resuming_download), and also note the -c option.

#24: T. J. Brumfield (enderandrew) (2009-06-12 23:35:43)
Perhaps as an addendum to this feature, I'd like to see the option to set a number of automatic retries. If YaST tells me it needs to download 3 GB worth of packages, I don't want to watch for one package to time out, hanging the whole process up. I'd like to configure it so that it automatically retries the package X times, then skips the package and moves on.

#25: Michal Papis (mpapis) (2009-07-04 21:14:32) (reply to #24)
A good addition here would be to watch the network status (maybe by pinging the download server every few minutes) and resume the download once it is back again.
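The retry-then-skip behaviour requested in #24, combined with the aria2c `-c` resume option noted in #23, can be sketched as a small shell wrapper. This is an illustration only, not how libzypp actually implements it; `fetch_with_retries` and `MAX_TRIES` are made-up names:

```shell
#!/bin/sh
# Try to download a URL up to MAX_TRIES times; each retry resumes the
# partial file (aria2c -c) instead of starting from scratch. On final
# failure the caller can skip this package and move on to the next one.
MAX_TRIES=${MAX_TRIES:-3}

fetch_with_retries() {
    url=$1
    n=1
    while [ "$n" -le "$MAX_TRIES" ]; do
        if aria2c -c "$url"; then
            return 0            # download complete
        fi
        echo "attempt $n/$MAX_TRIES failed for $url" >&2
        n=$((n + 1))
    done
    return 1                    # give up; caller skips the package
}
```

A caller would loop over its package list, invoking the function once per URL and collecting the failures for a final report instead of aborting the whole transaction on the first stuck package.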
#30: Eduard Avetisyan (dich) (2009-09-15 09:21:41) (reply to #24)
In fact, the system should keep trying to download this particular package and in the meantime try to download the next package in the list. I've had cases where a single package would fail to download (for unknown reasons, since the same URL worked fine in Firefox on the same machine) while the rest of the packages went fine. I also find curl less robust and featureful than wget; I wonder why the former was chosen as a backend...

#28: Kahenya Kamunyu (kahenya) (2009-08-29 22:03:55)
Yup, this should be there. One place where Ubuntu kicks openSUSE's ass.

#29: Vincent Petry (pvince81) (2009-09-03 07:24:15)
Here in China I also experience timeouts sometimes, even over a regular ADSL connection (512 kbps). It happens with my home connection as well as at my company's office. Resuming downloads would definitely be helpful, because sometimes I have to restart the same big package several times before it finishes.

+ #31: Dave Plater (plater) (2009-10-08 13:45:48)
+ It must be so simple to allow aria2c to use its resume-download
+ feature; I don't understand why it hasn't been implemented yet. I've
+ just completed a three-day 2.1 GB zypper dup on a beta-trial ADSL line
+ which caused many failures. The new download-in-advance feature would
+ have made zypper into a super upgrade package if it weren't blemished
+ by libzypp's inability to resume interrupted downloads. I had to use
+ aria2c manually to download two large packages directly into libzypp's
+ package cache to complete the upgrade.
+ See https://bugzilla.novell.com/show_bug.cgi?id=545242
+ Unfortunately the bug has been closed in favour of this feature
+ request, but not using aria2c's resume feature spoils an otherwise
+ streamlined zypper.

--
openSUSE Feature:
https://features.opensuse.org/120326
participants (1): fate_noreply@suse.de