On Wed, Sep 11, 2019 at 2:36 PM Linux Kamarada
I'm now thinking of an algorithm like this:
1. Download everything from OBS (using wget, as above)
2. For each file stored locally:
   2.1. Check whether the file still exists online (for example, by requesting it and checking for HTTP 304 Not Modified or HTTP 404 Not Found; maybe that is possible using wget too)
      2.1.1. If it does not exist online, delete it
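The steps above could be sketched in shell roughly like this. This is only a sketch of the idea, not the actual script; the base URL and mirror directory here are hypothetical placeholders:

```shell
#!/bin/sh
# Hypothetical repository URL and local mirror directory (placeholders)
BASE_URL="https://download.example.org/repo"
MIRROR_DIR="repo-mirror"

# Step 1: mirror everything; wget skips files that are already up to date
wget -m -np -nH -P "$MIRROR_DIR" "$BASE_URL/"

# Step 2: for each file stored locally, check whether it still exists online
find "$MIRROR_DIR" -type f | while read -r file; do
    rel="${file#$MIRROR_DIR/}"
    # HEAD request; print only the HTTP status code
    status=$(curl -s -o /dev/null -w '%{http_code}' -I "$BASE_URL/$rel")
    # Step 2.1.1: delete files that no longer exist online
    if [ "$status" = "404" ]; then
        rm -v "$file"
    fi
done
```

The curl call uses a HEAD request (-I) so the file's contents are not downloaded just to check that it still exists.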
Here it is: https://paste.opensuse.org/45482697

Alternative: https://gist.github.com/vinyanalista/ed031071151d00c5af4d6bdd49374680

Important to note: with the -m (--mirror) option, wget does not actually "download everything", as I said. It does not redownload files that have not changed; it only downloads files that are new or have been updated. Also, wget does not delete local files that have been removed remotely; that is what I do manually next with the help of curl, testing for 404 errors.

Have a lot of fun!

Antonio
The Linux Kamarada Project
http://kamarada.github.io/

--
To unsubscribe, e-mail: heroes+unsubscribe@opensuse.org
To contact the owner, e-mail: heroes+owner@opensuse.org