Andrei Borzenkov wrote:
12.06.2018 08:39, Per Jessen wrote:
So this feature would need another adjustment: a minimum speed below which a repo is rejected.
Unless the current download code is able to download in parallel :-?
Yes, it normally does. A download is split into chunks, and these are fetched from multiple mirrors.
As far as I can tell, the default downloader in zypper is curl; can it really do that? Wouldn't this require using aria as the default?
It is done with segmented downloading, really just plain HTTP: "get me a bit of this file, from byte x to byte y", i.e. an HTTP range request. I looked at this years back, because I was having trouble getting squid to cache the repos. A download is split into chunks of 256 KB, which are then spread across the available mirrors.

-- 
Per Jessen, Zürich (17.8°C)
http://www.hostsuisse.com/ - virtual servers, made in Switzerland.
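To illustrate the mechanism being described (not zypper's actual implementation), here is a minimal sketch of how segmented downloading can be planned: split the file into 256 KB chunks and assign each chunk to one of the available mirrors round-robin. The mirror URLs are hypothetical; each planned chunk would then be fetched with a plain HTTP Range header such as "Range: bytes=0-262143".

```python
CHUNK = 256 * 1024  # 256 KB per chunk, as in the description above

def plan_chunks(file_size, mirrors):
    """Return a list of (mirror, first_byte, last_byte) chunk assignments.

    Byte ranges are inclusive, matching HTTP Range header semantics.
    """
    plan = []
    for i, start in enumerate(range(0, file_size, CHUNK)):
        end = min(start + CHUNK, file_size) - 1   # last byte of this chunk
        mirror = mirrors[i % len(mirrors)]        # spread chunks across mirrors
        plan.append((mirror, start, end))
    return plan

# Hypothetical mirrors, ~1 MB file
mirrors = ["http://mirror1.example/repo", "http://mirror2.example/repo"]
plan = plan_chunks(1_000_000, mirrors)
print(len(plan))   # 4 chunks
print(plan[0])     # ('http://mirror1.example/repo', 0, 262143)
```

A real client would issue the range requests concurrently, verify each chunk (e.g. against per-block checksums from metalink metadata), and reassemble the file; this sketch only shows the chunk/mirror bookkeeping.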