On 06-08-2018 08:43, Per Jessen wrote:
> Klaus Vink Slott wrote:
>> On 05-08-2018 21:31, Carlos E. R. wrote:
>>>> * configure some kind of nfs sharing between all hosts
>>> This works half well. Problem is that it is easy to delete the copy
>>> by accident.
>> I see. I'm also a bit worried about what will happen if two computers
>> decide to download the same file simultaneously.
> No problem in that.

As Carlos already pointed out, I was a bit brief in my description of the NFS solution.
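
For the record, a rough sketch of what I had in mind (hostnames and
paths are only examples, adjust to taste):

  # on the "server" box, export the zypper package cache
  # /etc/exports
  /var/cache/zypp/packages  192.168.1.0/24(rw,sync,no_subtree_check)

  # on each client, mount it over the local cache
  # /etc/fstab
  server:/var/cache/zypp/packages  /var/cache/zypp/packages  nfs  defaults  0  0

  # and tell zypper to keep downloaded packages around
  zypper mr --keep-packages --all

With all machines sharing one package cache, the second machine to ask
for an rpm should find it already there - at the price of the accidental
deletions and concurrent downloads discussed above.
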
>>>> * add squid to my firewall
>>> squid is not easy to set up, but once done it should be automatic,
>>> and it only stores those files actually used.
>> I have just tried to add squid as a transparent http proxy to my
>> firewall (I use pfsense) ...
> It won't work as expected :-)

Damn... :-) Things often look simpler from the outside.
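
In case anyone wants to follow along, the squid side of what I tried
boils down to something like this (simplified, pfsense generates the
real squid.conf):

  http_port 3128
  http_port 3129 intercept    # firewall redirects port 80 here
  cache_dir ufs /var/squid/cache 10000 16 256
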
> zypper uses segmented downloads. Each rpm is split into 256K chunks
> which are then downloaded in parallel from multiple mirrors.

That is quite clever, but of course it also makes my simple fix
worthless.
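
If I understand the consequence correctly, the chunks only defeat the
cache because they come from many different hosts. So I suppose one
could pin a repo to a single fixed mirror instead of
download.opensuse.org, so that every request hits the same URL -
something like this in /etc/zypp/repos.d/repo-oss.repo (the mirror is
just an example):

  [repo-oss]
  name=openSUSE-Leap-15.0-Oss
  baseurl=http://ftp.gwdg.de/pub/opensuse/distribution/leap/15.0/repo/oss/
  enabled=1
  autorefresh=1
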
> It can be made to work perfectly well, but it takes a little more
> effort:
> http://wiki.jessen.ch/index/How_to_cache_openSUSE_repositories_with_Squid

Thanks for the link, I will dig into your blog post and decide which
path to follow. I rather like the idea that with squid I do not need
to change anything on the target machines.
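
Before I have read it properly, my guess is that the heart of it is
persuading squid to hold on to the rpm files - along these lines (my
own guess, not taken from your page, numbers picked out of the air):

  maximum_object_size 512 MB
  # rpms never change once published, keep them for weeks
  refresh_pattern -i \.rpm$ 10080 100% 43200
  # repo metadata changes often, keep it short
  refresh_pattern -i repodata/.* 0 20% 60
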
--
Regards
Klaus