On 2018-08-05 17:36, Klaus Vink Slott wrote:
> Hi
>
> I do a lot of testing and experimenting at home, so I often have something like 10 virtual openSUSE installs running. Until recently I kept an unofficial mirror of the base and update repositories; a simple cron job every night used wget to sync my local copy. By configuring my VMs to use this local repository, I avoided downloading the same files again for each VM during repeated install experiments and system updates.
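For comparison, that kind of nightly sync usually comes down to a one-line cron entry. This is only a sketch: the repository URL, the options, and the local path are example values, not your actual setup:

```shell
# /etc/cron.d/repo-sync -- nightly mirror of a selected repository.
# All paths and the URL are examples; adjust to the repos you use.
# --mirror recurses with timestamping, --no-parent stays below the
# start directory, --no-host-directories drops the hostname locally.
30 2 * * * root wget --quiet --mirror --no-parent --no-host-directories \
    --directory-prefix=/srv/mirror \
    http://download.opensuse.org/update/leap/42.3/oss/
```

If the mirror you sync from offers rsync, `rsync -a --delete` is friendlier to the server, since it only transfers files that actually changed.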
> Now I am looking to replace my local mirror with something more flexible, so suggestions, best practices, and pointers to examples are welcome. I am not sure, but I think I have seen an article on some openSUSE site describing a solution based on NFS-mounting the download directory on all hosts.
Yes, that is what I do, although your mirror system is better in some respects. If you want to use my setup, I'll explain how it works. The advantage is that it stores only the packages actually used, plus older versions of packages that were used before.
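One way to wire up that shared download directory is via zypper's package cache; this is only a sketch, and the network range and the hostname "server" are placeholders:

```shell
# On every VM: keep downloaded packages in /var/cache/zypp/packages
# instead of deleting them after installation.
# -k = --keep-packages, -a = --all (apply to all repositories).
zypper modifyrepo --keep-packages --all

# On the machine that owns the cache, export it over NFS.
# Line for /etc/exports (the network range is an example):
#   /var/cache/zypp/packages  192.168.1.0/24(rw,no_root_squash)

# On the other VMs, mount the shared cache in place of the local one.
# Line for /etc/fstab ("server" is a placeholder hostname):
#   server:/var/cache/zypp/packages  /var/cache/zypp/packages  nfs  defaults  0  0
```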
> Currently I see 3 possibilities:
>
> * continue to host a local mirror of selected repositories
This works well, but you have to store the whole repositories.
> * configure some kind of nfs sharing between all hosts
This works only half well. The problem is that it is easy to delete the shared copy by accident.
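If accidental deletion is the main worry, one mitigation is to export the share read-only to the clients; an /etc/exports sketch (the path and network range are example values):

```shell
# /etc/exports on the machine that holds the package cache.
# Clients get read-only access, so a stray "rm -rf" or
# "zypper clean" on a VM cannot empty the shared copy.
/srv/zypp-cache  192.168.1.0/24(ro,root_squash)
```

The trade-off is that only the exporting host can then add new packages to the cache.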
> * add squid to my firewall
Squid is not easy to set up, but once done it should run automatically and store only those files actually used.
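The key settings are the disk cache size and the maximum object size; a minimal squid.conf fragment (all sizes and patterns here are example values, not a tuned configuration):

```shell
# Additions to /etc/squid/squid.conf for caching RPM downloads.
# 20 GB disk cache; raise maximum_object_size, because the default
# is far too small to cache large packages.
cache_dir ufs /var/cache/squid 20000 16 256
maximum_object_size 512 MB

# Keep .rpm files for a long time even if the server sends weak
# cache headers: min 1 day, 50% of object age, max 30 days.
refresh_pattern -i \.rpm$ 1440 50% 43200
```

Each VM then has to be pointed at the proxy, for example with `http_proxy=http://firewall:3128`, where "firewall" stands for whatever hostname the proxy machine has.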
> Each has pros and cons, as I see it.
Some distributions have a system to create local caches automatically. We don't.

-- 
Cheers / Saludos,
Carlos E. R.
(from 42.3 x86_64 "Malachite" at Telcontar)