-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
On 2010-07-25 17:18, Richard Creighton wrote:
On Sunday 25 July 2010 11:03:13 Carlos E. R. wrote:
On 2010-07-25 16:37, Richard Creighton wrote:
On Sunday 25 July 2010 07:09:04 Carlos E. R. wrote:
Check out Dave Rankin's contribution:
Works great, and using cron or another method you can automate it completely. He put a lot of thought and effort into this package, and it is well documented. It lets all your issues be addressed across multiple machines and versions, including simultaneous updates. He deserves many thanks.
I think I have seen it. Creating a local repo?
Well, this is a different idea. There are many ways to skin a cat, as the saying goes :-)
Possibly the biggest advantage of this is that it contains every file you will ever need to reconstruct your system(s), even AFTER end of support from the official repositories, as with the recent demise of 11.0 and earlier. 10.3 just 'disappeared', leaving people with older hardware somewhat out in the cold if they lost the drive containing their OS and needed to rebuild it. Sure, they could reinstall from the original DVD, but what about all the updates that are no longer available? Having a local copy of the ones you had applied is certainly handy, as is everything else you ever downloaded for your version of the OS and your machine.

It also allows you to go back to an earlier version that might be out of support, if the current version turns out to have a hidden uh-oh. Sure, backups are important, but they are rarely 'bare-metal'; they usually concentrate on data, not the OS itself. Dave's system saves space, is quick, saves time, and is easy to back up, and unlike a mirror of a repository, it contains only what you actually need for your installation as you have installed it on any of your machines.
Replicating the oss, non-oss, and updates repos, even for only one architecture, takes days of downloading. I know because I did so.
I know, it is a good solution. It just needs space and good internet bandwidth (which I don't have). What I'm proposing is a different solution, with different requirements, and advantages, and problems.
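For reference, a full local mirror of the kind described above is usually pulled with rsync. This is only a sketch: the mirror hostname, module path, and version below are placeholders, so check the layout of a real openSUSE mirror near you that offers rsync before running anything like it.

```shell
#!/bin/sh
# Sketch: mirror one release's oss and update repos for one architecture.
# MIRROR and DEST are example values, not a real mirror address.
MIRROR="rsync://mirror.example.org/opensuse"
DEST="/srv/repo/11.2"

mkdir -p "$DEST/oss" "$DEST/update"

# --delete keeps the local copy in sync with the mirror;
# --exclude skips architectures you do not use, to save some space.
rsync -avH --delete --exclude='ppc*' --exclude='ia64' \
    "$MIRROR/distribution/11.2/repo/oss/" "$DEST/oss/"
rsync -avH --delete \
    "$MIRROR/update/11.2/" "$DEST/update/"
```

Even trimmed like this, the first run transfers tens of gigabytes, which is why it takes days on a slow link.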
Recently somebody asked for a solution to a similar problem. His bandwidth is even more limited than mine, because he pays per megabyte. What he and I want is to download everything we need just once, for perhaps two or three computers, and to do so with as little hassle as possible.
For this scenario a full local repo is impossible. The shared directories via nfs are workable, with some care.
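A minimal sketch of that NFS arrangement: tell libzypp to keep the downloaded RPMs, then export its package cache to the other machines. The repo aliases and the network range are assumptions; use whatever `zypper lr` shows on your systems.

```shell
# On the machine that downloads first: keep RPMs after installation.
# 'repo-oss' and 'repo-update' are example aliases from 'zypper lr'.
zypper mr --keep-packages repo-oss repo-update

# The kept packages accumulate under the libzypp cache:
ls /var/cache/zypp/packages

# Export that cache read-only over NFS (the 192.168.1.0/24 network
# is an assumed example; adjust /etc/exports to your LAN).
echo '/var/cache/zypp/packages 192.168.1.0/24(ro,root_squash)' >> /etc/exports
exportfs -ra

# On the other machines, mount the cache and copy or install from it.
mount -t nfs server:/var/cache/zypp/packages /mnt/packages
```

The care needed is mostly about not letting two machines write to the shared cache at the same time, hence the read-only export.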
It is also possible to build a local repo from the files already downloaded by each computer on the network, with whatever we have, even if incomplete. But I know of no way to make YaST pick packages from that incomplete repo and fetch what is missing from a remote one.
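For what it is worth, building that incomplete repo is straightforward with createrepo; whether the resolver then behaves as wanted would need testing. A sketch, with example hostnames and paths, registering the local repo at a higher priority (lower number) than the remote ones so that packages present locally are preferred:

```shell
# Collect the RPMs already downloaded on each machine into one tree.
# 'machine1', 'machine2', and the paths are example names.
mkdir -p /srv/local-repo
rsync -av machine1:/var/cache/zypp/packages/ /srv/local-repo/
rsync -av machine2:/var/cache/zypp/packages/ /srv/local-repo/

# Generate repo metadata for whatever is there, complete or not.
createrepo /srv/local-repo

# Register it with zypper; priority 50 beats the default of 99,
# so the resolver should take local copies when they exist.
zypper ar -p 50 dir:///srv/local-repo local-cache
zypper ref
```

With both this repo and the normal remote repos enabled, anything missing locally would still have to come from the remote, which is exactly the fallback behaviour being asked about.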
- -- 
Cheers / Saludos,
Carlos E. R. (from 11.2 x86_64 "Emerald" GM (Elessar))