[opensuse] Local rpm storage/cache
Hi

I do a lot of testing and experimenting at home, so I often have something like 10 virtual openSUSE installs running. Until recently I had an unofficial mirror of the base and update repositories: a simple cron job used wget every night to sync my local repository. By configuring my VMs to use this local repository I avoided downloading the same file again for each VM during repeated install experiments and system updates.

Now I am looking to replace my local mirror with something more flexible, so suggestions, best practices and pointers to examples are welcome. I'm not sure, but I think I've seen an article on some openSUSE site describing a solution based on NFS-mounting the download directory on all hosts.

Currently I see 3 possibilities:

* continue to host a local mirror of selected repositories
* configure some kind of NFS sharing between all hosts
* add squid to my firewall

Each has pros and cons as I see it.

--
Regards
Klaus
--
To unsubscribe, e-mail: opensuse+unsubscribe@opensuse.org
To contact the owner, e-mail: opensuse+owner@opensuse.org
On 2018-08-05 17:36, Klaus Vink Slott wrote:
Hi
I do a lot of testing and experimenting at home, so I often have something like 10 virtual openSUSE installs running. Until recently I had an unofficial mirror of the base and update repositories: a simple cron job used wget every night to sync my local repository. By configuring my VMs to use this local repository I avoided downloading the same file again for each VM during repeated install experiments and system updates.
Now I am looking to replace my local mirror with something more flexible, so suggestions, best practices and pointers to examples are welcome. I'm not sure, but I think I've seen an article on some openSUSE site describing a solution based on NFS-mounting the download directory on all hosts.
Yes, that is what I do, but your system is better. If you want to use it, I'll explain how it goes. The advantage is that it only stores what is used, and older versions of packages that were used before.
Currently I see 3 possibilities:
* continue to host a local mirror of selected repositories
This works well, but has to store whole repositories.
* configure some kind of nfs sharing between all hosts
This works half well. Problem is that it is easy to delete the copy by accident.
* add squid to my firewall
squid is not easy to set up, but once done it should be automatic, and only store those files actually used.
each has pros and cons as I see it.
Some distributions have a system to create local caches automatically. We don't.

--
Cheers / Saludos,
Carlos E. R. (from 42.3 x86_64 "Malachite" at Telcontar)
On 05-08-2018 21:31, Carlos E. R. wrote:
On 2018-08-05 17:36, Klaus Vink Slott wrote:
Hi ... Until recently I had an unofficial mirror of the base and update repositories. A simple cron job used wget every night to sync my local repository. By configuring my VMs to use this local repository I avoided downloading the same file again for each VM ... Now I am looking to replace my local mirror with something more flexible, so suggestions, best practices and pointers to examples are welcome. I'm not sure, but I think I've seen an article on some openSUSE site describing a solution based on NFS-mounting the download directory on all hosts.
Yes, that is what I do, but your system is better.
If you want to use it, I'll explain how it goes. The advantage is that it only stores what is used, and older versions of packages that were used before.
Thanks. If you don't mind, I would like to keep that offer open while I investigate using squid.
Currently I see 3 possibilities: * continue to host a local mirror of selected repositories This works well, but has to store whole repositories.
Yes, and it also requires some monitoring to ensure that the download did not fail, that the mirror you are using is current, etc. It also uses a huge amount of disk to host a lot of RPMs that I'll probably never use. But hey, disk is cheap.
* configure some kind of nfs sharing between all hosts This works half well. Problem is that it is easy to delete the copy by accident.
I see. I'm also a bit worried about what will happen if 2 computers decide to download the same file simultaneously.
* add squid to my firewall squid is not easy to setup, but once done it should be automatic, and only store those files actually used.
I have just tried to add squid as a transparent HTTP proxy on my firewall (I use pfSense); actually it went quite smoothly. But I have not verified that it works as planned yet. Luckily all repositories seem to be accessible using HTTP. If/when that changes, it will require some more work.
Some distributions have a system to create local caches automatically. We don't.
Yes, at my work we use Red Hat Satellite. It can be configured to host/maintain foreign repositories, but that is way overkill for my home use (and I guess not free/open source either).

--
Thanks for your insight.
Klaus
On Sunday, 2018-08-05 at 22:37 +0200, Klaus Vink Slott wrote:
On 05-08-2018 21:31, Carlos E. R. wrote:
On 2018-08-05 17:36, Klaus Vink Slott wrote:
Hi ... Until recently I had an unofficial mirror of the base and update repositories. A simple cron job used wget every night to sync my local repository. By configuring my VMs to use this local repository I avoided downloading the same file again for each VM ... Now I am looking to replace my local mirror with something more flexible, so suggestions, best practices and pointers to examples are welcome. I'm not sure, but I think I've seen an article on some openSUSE site describing a solution based on NFS-mounting the download directory on all hosts.
Yes, that is what I do, but your system is better.
If you want to use it, I'll explain how it goes. The advantage is that it only stores what is used, and older versions of packages that were used before.
Thanks. If you don't mind, I would like to keep that offer open while I investigate using squid.
Currently I see 3 possibilities: * continue to host a local mirror of selected repositories This works well, but has to store whole repositories.
Yes, and it also requires some monitoring to ensure that the download did not fail, that the mirror you are using is current, etc. It also uses a huge amount of disk to host a lot of RPMs that I'll probably never use. But hey, disk is cheap.
Right.
* configure some kind of nfs sharing between all hosts This works half well. Problem is that it is easy to delete the copy by accident.
I see. I'm also a bit worried about what will happen if 2 computers decide to download the same file simultaneously.
Oh, yes. So far I managed to not try.
* add squid to my firewall squid is not easy to setup, but once done it should be automatic, and only store those files actually used.
I have just tried to add squid as a transparent HTTP proxy on my firewall (I use pfSense); actually it went quite smoothly. But I have not verified that it works as planned yet. Luckily all repositories seem to be accessible using HTTP. If/when that changes, it will require some more work.
The problem is that the download place at opensuse actually redirects to mirrors all over the world, thus a cache will be confused. It is possible that a second machine gets a different mirror, and thus downloads again the package.
Some distributions have a system to create local caches automatically. We don't.
Yes, at my work we use Red Hat Satellite. It can be configured to host/maintain foreign repositories, but that is way overkill for my home use (and I guess not free/open source either).
I don't remember the name of the package or service, but I understood it was open. Debian perhaps?

--
Cheers,
Carlos E. R. (from openSUSE 42.3 x86_64 "Malachite" at Telcontar)
Klaus Vink Slott wrote:
On 05-08-2018 21:31, Carlos E. R. wrote:
* configure some kind of nfs sharing between all hosts This works half well. Problem is that it is easy to delete the copy by accident.
I see. I'm also a bit worried about what will happen if 2 computers decide to download the same file simultaneously.
No problem in that.
* add squid to my firewall squid is not easy to setup, but once done it should be automatic, and only store those files actually used.
I have just tried to add squid as a transparent http proxy to my firewall (I use pfsense), actually it went quite smooth. But I have not verified that it works as planned yet. Luckily all repositories seems to be accessible using http. If/When that change, it will require some more work.
It won't work as expected :-) zypper uses segmented downloads: each RPM is split into 256K chunks which are then downloaded in parallel from multiple mirrors. It can be made to work perfectly well, but it takes a little more effort:

http://wiki.jessen.ch/index/How_to_cache_openSUSE_repositories_with_Squid

--
Per Jessen, Zürich (23.1°C)
http://www.cloudsuisse.com/ - your owncloud, hosted in Switzerland.
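For reference, a squid configuration along the lines of that wiki page might look roughly like this. The cache sizes and refresh times below are illustrative assumptions, not the wiki's exact settings:

```
# /etc/squid/squid.conf (excerpt) -- illustrative values only
cache_dir aufs /var/cache/squid 20000 16 256   # ~20 GB of on-disk cache
maximum_object_size 512 MB                     # allow large RPMs into the cache

# RPM files never change once published, so cache them aggressively;
# repository metadata (repomd.xml) must always be revalidated.
refresh_pattern -i \.rpm$        129600 100% 129600 ignore-reload
refresh_pattern -i repomd\.xml$  0      0%   0
```

The segmented, multi-mirror downloads Per mentions still need the extra handling described on the wiki page.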
On Monday, 2018-08-06 at 08:43 +0200, Per Jessen wrote:
Klaus Vink Slott wrote:
On 05-08-2018 21:31, Carlos E. R. wrote:
* configure some kind of nfs sharing between all hosts This works half well. Problem is that it is easy to delete the copy by accident.
I see. I'm also a bit worried about what will happen if 2 computers decide to download the same file simultaneously.
No problem in that.
No, he is talking of "nfs rpm sharing", a method I invented, sort of. In this setup two machines downloading the same rpm is a no-no. Basically, the directory "/var/cache/zypp/packages" is shared directly by several machines via NFS. Zypper assumes the directory is his and only his, so we are tricking him. It.

--
Cheers,
Carlos E. R. (from openSUSE 42.3 x86_64 "Malachite" at Telcontar)
On Monday, 2018-08-06 at 13:11 +0200, Carlos E. R. wrote:
On Monday, 2018-08-06 at 08:43 +0200, Per Jessen wrote:
Klaus Vink Slott wrote:
On 05-08-2018 21:31, Carlos E. R. wrote:
* configure some kind of nfs sharing between all hosts This works half well. Problem is that it is easy to delete the copy by accident.
I see. I'm also a bit worried about what will happen if 2 computers decide to download the same file simultaneously.
No problem in that.
No, he is talking of "nfs rpm sharing", a method I invented, sort of. In this setup two machines downloading the same rpm is a no-no.
Basically, the directory "/var/cache/zypp/packages" is shared directly by several machines via NFS. Zypper assumes the directory is his and only his, so we are tricking him. It.
Also, repositories are configured to keep the downloaded packages after installing them, which is not the default. If one of the machines forgets this, all the packages are deleted and nothing is shared.

--
Cheers,
Carlos E. R. (from openSUSE 42.3 x86_64 "Malachite" at Telcontar)
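A minimal sketch of this NFS setup might look as follows. The server name, network range and options are assumptions for illustration:

```
# On the server, /etc/exports -- export the zypper package cache:
/var/cache/zypp/packages  192.168.1.0/24(rw,no_root_squash,sync)

# On each client, /etc/fstab -- mount it over the local cache directory:
server:/var/cache/zypp/packages  /var/cache/zypp/packages  nfs  defaults  0 0

# On every machine, make zypper keep downloaded packages
# (the non-default setting Carlos mentions):
#   zypper mr --keep-packages --all
```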
Carlos E. R. wrote:
On Monday, 2018-08-06 at 08:43 +0200, Per Jessen wrote:
Klaus Vink Slott wrote:
On 05-08-2018 21:31, Carlos E. R. wrote:
* configure some kind of nfs sharing between all hosts This works half well. Problem is that it is easy to delete the copy by accident.
I see. I'm also a bit worried about what will happen if 2 computers decide to download the same file simultaneously.
No problem in that.
No, he is talking of "nfs rpm sharing", a method I invented, sort of. In this setup two machines downloading the same rpm is a no-no.
Basically, the directory "/var/cache/zypp/packages" is shared directly by several machines via NFS. Zypper assumes the directory is his and only his, so we are tricking him. It.
Oh, I thought it was just accessing a local repo copy via NFS. Sorry.

--
Per Jessen, Zürich (29.5°C)
http://www.dns24.ch/ - free dynamic DNS, made in Switzerland.
On Monday, 2018-08-06 at 13:37 +0200, Per Jessen wrote:
Carlos E. R. wrote:
On Monday, 2018-08-06 at 08:43 +0200, Per Jessen wrote:
Klaus Vink Slott wrote:
On 05-08-2018 21:31, Carlos E. R. wrote:
* configure some kind of nfs sharing between all hosts This works half well. Problem is that it is easy to delete the copy by accident.
I see. I'm also a bit worried about what will happen if 2 computers decide to download the same file simultaneously.
No problem in that.
No, he is talking of "nfs rpm sharing", a method I invented, sort of. In this setup two machines downloading the same rpm is a no-no.
Basically, the directory "/var/cache/zypp/packages" is shared directly by several machines via NFS. Zypper assumes the directory is his and only his, so we are tricking him. It.
Oh, I thought it was just accessing a local repo copy via NFS. Sorry.
And I did not know one could access a local repo copy via NFS, either.

--
Cheers,
Carlos E. R. (from openSUSE 42.3 x86_64 "Malachite" at Telcontar)
Carlos E. R. wrote:
On Monday, 2018-08-06 at 13:37 +0200, Per Jessen wrote:
Carlos E. R. wrote:
On Monday, 2018-08-06 at 08:43 +0200, Per Jessen wrote:
Klaus Vink Slott wrote:
On 05-08-2018 21:31, Carlos E. R. wrote:
* configure some kind of nfs sharing between all hosts This works half well. Problem is that it is easy to delete the copy by accident.
I see. I'm also a bit worried about what will happen if 2 computers decide to download the same file simultaneously.
No problem in that.
No, he is talking of "nfs rpm sharing", a method I invented, sort of. In this setup two machines downloading the same rpm is a no-no.
Basically, the directory "/var/cache/zypp/packages" is shared directly by several machines via NFS. Zypper assumes the directory is his and only his, so we are tricking him. It.
Oh, I thought it was just accessing a local repo copy via NFS. Sorry.
And I did not know one could access a local repo copy via nfs, either.
TBH, I have never tried it, but I assume it works because installation via NFS has been supported for years.

--
Per Jessen, Zürich (29.8°C)
http://www.dns24.ch/ - your free DNS host, made in Switzerland.
On 06-08-2018 08:43, Per Jessen wrote:
Klaus Vink Slott wrote:
On 05-08-2018 21:31, Carlos E. R. wrote:
* configure some kind of nfs sharing between all hosts
This works half well. Problem is that it is easy to delete the copy by accident.
I see. I'm also a bit worried about what will happen if 2 computers decide to download the same file simultaneously.
No problem in that.
As Carlos already pointed out, I was a bit brief in my description of the NFS solution.
* add squid to my firewall squid is not easy to setup, but once done it should be automatic, and only store those files actually used.
I have just tried to add squid as a transparent HTTP proxy on my firewall (I use pfSense) ... It won't work as expected :-)

Damn.. :-) Things often look simpler from the outside.
zypper uses segmented downloads. Each RPM is split into 256K chunks which are then downloaded in parallel from multiple mirrors.

That is quite clever, but of course it also makes my simple fix worthless.
It can be made to work perfectly well, but it takes a little more effort
http://wiki.jessen.ch/index/How_to_cache_openSUSE_repositories_with_Squid

Thanks for the link, I will dig into your blog post and decide which path to follow. I kind of like the idea that, using squid, I do not need to change anything on the target.
--
Regards
Klaus
Klaus Vink Slott wrote:
On 06-08-2018 08:43, Per Jessen wrote:
It can be made to work perfectly well, but it takes a little more effort
http://wiki.jessen.ch/index/How_to_cache_openSUSE_repositories_with_Squid
Thanks for the link, I will dig into your blogpost and decide which path to follow. I kind of like the idea that using squid I do not need to change anything on the target.
Yes, that was also our main thinking - to install, just plug in a USB or a DVD, no need to change anything. Let me know if you need any help, my setup is still running.

--
Per Jessen, Zürich (26.9°C)
http://www.dns24.ch/ - free dynamic DNS, made in Switzerland.
Hello Klaus,

With regards to your use case of home lab testing, apt-cacher-ng might be worth looking into. I have used acng only in front of Debian-based systems, but apparently there is RPM repository support as well:

https://www.unix-ag.uni-kl.de/~bloch/acng/html/distinstructions.html#hint-su...

Jeff Kowalczyk
Klaus Vink Slott wrote:
Hi
I do a lot of testing and experimenting at home, so I often have something like 10 virtual openSUSE installs running. Until recently I had an unofficial mirror of the base and update repositories: a simple cron job used wget every night to sync my local repository. By configuring my VMs to use this local repository I avoided downloading the same file again for each VM during repeated install experiments and system updates.
Now I am looking to replace my local mirror with something more flexible, so suggestions, best practices and pointers to examples are welcome. I'm not sure, but I think I've seen an article on some openSUSE site describing a solution based on NFS-mounting the download directory on all hosts.
Currently I see 3 possibilities:
* continue to host a local mirror of selected repositories * configure some kind of nfs sharing between all hosts * add squid to my firewall
each has pros and cons as I see it.
Options 1 and 2 only differ in the access method - http:// or nfs://. I have used both 1 and 3. Setting up squid was the most complex, but once done it is fully independent & transparent. We have since switched to using a local mirror (as we also host a public mirror). There is no difference between the two except in the amount of effort in setting it up.

--
Per Jessen, Zürich (23.1°C)
http://www.dns24.ch/ - your free DNS host, made in Switzerland.
Klaus Vink Slott wrote:
Currently I see 3 possibilities:
* continue to host a local mirror of selected repositories * configure some kind of nfs sharing between all hosts * add squid to my firewall
I'm also running a mirror, updated every night via rsync (check which mirrors support that). The repo is exported from the server to ro-mount from any host in our network, and normal machines do this via automount. Zypper can automount itself, if you prefer that:

  baseurl=nfs://<server_ip>/export/opensuse/tumbleweed/repo/oss?mountoptions=ro

I also sync this repo to an external SSD, for use at home where I only have a 6 Mbit line. There I have repos configured as, e.g.,

  [openSUSE-Tumbleweed]
  name=openSUSE-Tumbleweed
  enabled=1
  autorefresh=0
  baseurl=hd:///tumbleweed/repo/oss?device=/dev/disk/by-label/Leap-repos&filesystem=auto
          http://download.opensuse.org/tumbleweed/repo/oss/
  path=/
  type=rpm-md
  keeppackages=0

A 'zypper ref' will use the first match it finds, i.e., the disk if connected, else (if I forgot the disk at the office...) use the net...
Peter Suetterlin wrote:
Klaus Vink Slott wrote:
Currently I see 3 possibilities:
* continue to host a local mirror of selected repositories * configure some kind of nfs sharing between all hosts * add squid to my firewall
I'm also running a mirror, updated every night via rsync (check which mirrors support that).
There is really little need to seek out a specific mirror, we offer this via rsync.o.o. I only just yesterday spent a couple of hours tidying it up, we were running out of space.

--
Per Jessen, Zürich (28.9°C)
member, openSUSE Heroes.
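A nightly sync against rsync.o.o could be as simple as a crontab entry along these lines. The module name and paths below are assumptions; list what is actually offered with `rsync rsync.opensuse.org::` before relying on them:

```
# crontab fragment: mirror the Leap 42.3 OSS update repo every night at 03:00
0 3 * * *  rsync -a --delete rsync.opensuse.org::opensuse-full/opensuse/update/leap/42.3/oss/ /srv/mirror/update/leap/42.3/oss/
```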
Per Jessen wrote:
There is really little need to seek out a specific mirror, we offer this via rsync.o.o. I only just yesterday spent a couple of hours tidying it up, we were running out of space.
Good to know (and thanks for your work :) ). I've hardcoded the gwdg mirror in my rsync cron, as it's the fastest I found (I'm connected to GEANT2 here); the automatic redirection via o.o often gives me closer (but slower) mirrors, at least for the http(s) connections....
Peter Suetterlin wrote:
Per Jessen wrote:
There is really little need to seek out a specific mirror, we offer this via rsync.o.o. I only just yesterday spent a couple of hours tidying it up, we were running out of space.
Good to know (and thanks for your work :) ). I've hardcoded the gwdg mirror in my rsync cron, as it's the fastest I found (I'm connected to GEANT2 here), the automatic redirection via o.o often gives me closer (but slower) mirrors, at least for the http(s) connections....
I was just about to add something about the speed of a mirror - rsync.o.o is in Nürnberg, it might be slow(ish) in more remote parts of the world. As for gwdg, yes, that one is FAST.

--
Per Jessen, Zürich (29.5°C)
http://www.cloudsuisse.com/ - your owncloud, hosted in Switzerland.
On 08/05/2018 10:36 AM, Klaus Vink Slott wrote:
Currently I see 3 possibilities:
* continue to host a local mirror of selected repositories * configure some kind of nfs sharing between all hosts * add squid to my firewall
4) Create your own custom local repo holding only the files installed on your systems.

By far my favorite. With a couple of helper scripts and a cron job, just rsync all local packages from each machine to your local repo (somewhere accessible by http across your LAN). rsync will ensure only one copy of each RPM is placed in your local repo. Then you simply call 'createrepo' and build the metadata for your local repo on your server.

It works fantastically, but requires that you have some way of serving the repository (web server, etc.). It also requires that you disable download of delta RPMs and enable keeppackages on your local computers. (Your cron job can sweep each box clean after transferring the files to your local server.) Your repo can be as formal, signed, etc. as you want, and the helper scripts simply remove old versions, etc., and rebuild the metadata after each change.

I would use this over any of options 1-3, as it allows you to host only the files actually used by your systems, downloaded once in the normal course of your installs/updates.

--
David C. Rankin, J.D.,P.E.
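A sketch of what such a cron-driven setup could look like. The hostnames and paths are hypothetical, and it assumes keeppackages is enabled on the clients and a web server exports /srv/repo/local:

```
# crontab fragment on the repo server -- illustrative only
# Pull the cached RPMs from each machine, then rebuild the repo metadata.
30 2 * * *  rsync -a vm1:/var/cache/zypp/packages/ /srv/repo/local/
35 2 * * *  rsync -a vm2:/var/cache/zypp/packages/ /srv/repo/local/
45 2 * * *  createrepo --update /srv/repo/local
```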
David C. Rankin wrote:
On 08/05/2018 10:36 AM, Klaus Vink Slott wrote:
Currently I see 3 possibilities:
* continue to host a local mirror of selected repositories * configure some kind of nfs sharing between all hosts * add squid to my firewall
4)
Create your own custom local repo holding only the files installed on your system. By far my favorite. With a couple of helper scripts and a cron job, just rsync all local packages from each machine to your local repo (somewhere accessible by http across your lan). rsync will insure only one copy of the rpm is placed in your local repo. The you simply call 'createrepo' and build the metadata for your local repo on your server.
Interesting option, I'll have to try it out.
I would use this over either of the 1-3 options as this allows you to only host the files used by your system that were downloaded once in the normal course of your installs/update.
The squid option does that too.

--
Per Jessen, Zürich (17.9°C)
http://www.cloudsuisse.com/ - your owncloud, hosted in Switzerland.
participants (6)
-
Carlos E. R.
-
David C. Rankin
-
Jeff Kowalczyk
-
Klaus Vink Slott
-
Per Jessen
-
Peter Suetterlin