On 15/03/2019 07.53, Per Jessen wrote:
> Anton Aylward wrote:
>> Oh bugger, I hate that sort of manual work. Why isn't zypper using cURL
>> under the hood, so that this can be automated?
>> No, wait a moment:
>>
>> # ldd /usr/bin/zypper | grep curl
>>         libcurl.so.4 => /usr/lib64/libcurl.so.4 (0x00007fa18c6f0000)
>> and
>> # ldd /usr/lib64/libzypp.so.1600 | grep curl
>>         libcurl.so.4 => /usr/lib64/libcurl.so.4 (0x00007f522a353000)
>> It *IS* using the curl library. So why can't zypper do a restart?
> Just one of those things that were not considered - a mostly working
> internet connection. It doesn't sound too difficult to do, though: if
> the download is incomplete, just pass the current file size to the
> libcurl invocation.
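Something along those lines with the libcurl easy interface, I suppose. This is only a sketch to illustrate the idea, not zypper's actual code: the URL and file name are made up, and it assumes the mirror honours byte-range requests.

/* Sketch: resume a download from whatever is already on disk.
 * Build with:  gcc resume.c -o resume -lcurl
 */
#include <stdio.h>
#include <sys/stat.h>
#include <curl/curl.h>

int main(void)
{
    const char *url  = "https://example.org/foo.rpm"; /* made-up URL        */
    const char *path = "foo.rpm.part";                /* made-up local file */

    /* How much do we already have?  0 means start from scratch. */
    struct stat st;
    curl_off_t have = (stat(path, &st) == 0) ? (curl_off_t)st.st_size : 0;

    FILE *out = fopen(path, have ? "ab" : "wb");      /* append when resuming */
    if (!out) {
        perror(path);
        return 1;
    }

    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *h = curl_easy_init();
    if (!h) {
        fclose(out);
        return 1;
    }
    curl_easy_setopt(h, CURLOPT_URL, url);
    curl_easy_setopt(h, CURLOPT_WRITEDATA, out);       /* default callback writes to FILE* */
    curl_easy_setopt(h, CURLOPT_FOLLOWLOCATION, 1L);
    /* The interesting bit: ask the server for bytes from offset 'have' onwards. */
    curl_easy_setopt(h, CURLOPT_RESUME_FROM_LARGE, have);

    CURLcode rc = curl_easy_perform(h);
    if (rc != CURLE_OK)
        fprintf(stderr, "download failed: %s\n", curl_easy_strerror(rc));

    fclose(out);
    curl_easy_cleanup(h);
    curl_global_cleanup();
    return rc == CURLE_OK ? 0 : 1;
}

In other words, the library side of "restart where it left off" is one setopt call; the work would be in making zypper keep and find its partial files.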
The reason is clear from the comments by Benjamin Zeller and Andrei Borzenkov: the partial download is stored under a randomized name (why, I don't know; I can only guess), and a restarted zypper picks a new name, so the partial file is never reused.

Zypper does use multiple download threads (as I thought), which lets it cope with individual downloads being interrupted at the remote end, but not with all threads failing on the local side. Curious. It seems the application should recover if left alone long enough, though. I wonder whether limiting the number of simultaneous downloads would help, and whether that is even possible - see the zypp.conf excerpt below.

-- 
Cheers / Saludos,

		Carlos E. R.
		(from 15.0 x86_64 at Telcontar)
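For reference, this is the setting I had in mind. If I remember correctly it lives in /etc/zypp/zypp.conf (the name and default below are from memory, so check your own copy), and I have not verified whether it influences this particular failure mode:

## Maximum number of concurrent connections to use per transfer
##
## Valid values: Integer
## Default value: 5
##
# download.max_concurrent_connections = 5

Uncommenting it and lowering the value to 1 or 2 might at least reduce the chance of every transfer failing at once on a flaky line.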