RE: [SLE] On-demand caching local mirror?
-----Original Message-----
From: Stephen Boddy [mailto:stephen.boddy@btinternet.com]
Sent: Wednesday, July 12, 2006 10:21 PM
To: suse-linux-e@suse.com
Subject: [SLE] On-demand caching local mirror?
I'm wondering if anyone can come up with an answer. I'm after a piece of software that will "mirror" an FTP directory (as an example) locally, but will not retrieve a file until it is read. I'm thinking of something like a FUSE local filesystem.
i.e. the SUSE updates repository:
1. Set up a local folder.
2. Get the remote file list.
3. An app requests a file.
4. If it is in the local cache, return the file; go to 3.
5. If not local, and the connection is closed, reopen it.
6. Try to retrieve it, then return the file; go to 3.
(Every so often (on mount, once a day) repeat 2.)
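The steps above could be sketched in a few lines of Python. This is only an illustration of the caching logic, not a real mirror: the function and parameter names are my own, and `fetch_remote` stands in for whatever FTP/HTTP retrieval (and the FUSE plumbing) a real implementation would use.

```python
import os

def fetch_cached(cache_dir, name, fetch_remote):
    """Return the contents of `name`, downloading it only on first access.

    `fetch_remote` is a callable taking the file name and returning its
    bytes; in a real mirror it would be an FTP or HTTP retrieval.
    """
    path = os.path.join(cache_dir, name)
    if os.path.exists(path):          # step 4: already cached, return it
        with open(path, "rb") as f:
            return f.read()
    data = fetch_remote(name)         # steps 5-6: retrieve on demand
    os.makedirs(cache_dir, exist_ok=True)
    with open(path, "wb") as f:       # cache it for the next requester
        f.write(data)
    return data
```

The point of the design is visible in the control flow: the second machine to ask for the same update hits the `os.path.exists` branch and never touches the remote server.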
The point being that I have multiple machines I want to update, so I don't want to waste bandwidth by a) downloading the kde updates repeatedly, or b) replicating the whole folder with updates for packages I haven't installed.
It seems like this would be a sensible addition to SUSE for those of us with multiple machines (applying to virtual ones too). I've googled and gone through lots of pages, but I can't find something that operates like this.
The closest I've seen so far is fuseftp. The two problems I can see with this are:
1. I think the cache expiry time is specified in seconds, and I can't see if there is an indefinite option (i.e. passing 0).
2. I think it holds the FTP connection open while it is mounted, which wouldn't be good for the servers, with people holding connections open.
Any suggestions?
I think it sounds like a great idea, personally... Given the lack of that particular functionality, I think I might try something like this, though:

`cd /srv/www/htdocs; rsync -arv --delete --exclude='*.ppc*' rsync://suse.mirrors.tds.net/suse/update .; chkconfig apache2 on; /etc/init.d/apache2 start`

And then point at that machine as your update source from everywhere else. (And if you didn't want to share it out via HTTP, there's FTP, or NFS, or....)

It does mean that you need to download everything, but you only have to do it once. And keeping your update tree up-to-date is just a matter of cronning the rsync to run once a day or so.

(One note: I do this because I have 50 or 60 machines to keep up to date, running a bunch of different SUSE versions. If you're just looking to keep a few, like 3 or 5, updated, then you probably want to use the public update sites as usual -- rsync can be a heavy protocol, and using it unnecessarily is generally considered to be a Bad Thing.)
--
Check the headers for your unsubscription address
For additional commands send e-mail to suse-linux-e-help@suse.com
Also check the archives at http://lists.suse.com
Please read the FAQs: suse-linux-e-faq@suse.com
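If you do go the full-mirror route, the once-a-day refresh could be croned with something like the fragment below. It is only a sketch: the mirror URL and exclude pattern are taken from the command above, while the 03:30 schedule and the `/etc/cron.d/suse-mirror` path are arbitrary choices of mine.

```shell
# /etc/cron.d/suse-mirror -- refresh the local update tree nightly at 03:30.
# The trailing "." makes the current directory the rsync destination.
30 3 * * * root cd /srv/www/htdocs && rsync -ar --delete --exclude='*.ppc*' rsync://suse.mirrors.tds.net/suse/update .
```

Dropping the `-v` flag in the cron job keeps the mail that cron sends on output down to actual errors.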
On Thursday 13 July 2006 15:37, Marlier, Ian wrote:
I think it sounds like a great idea, personally...
Given the lack of that particular functionality, I think I might try something like this, though...
`cd /srv/www/htdocs; rsync -arv --delete --exclude='*.ppc*' rsync://suse.mirrors.tds.net/suse/update .; chkconfig apache2 on; /etc/init.d/apache2 start`
And then point at that machine as your update source, from everywhere else. (and if you didn't want to share it out via http, ftp or nfs or....)
It does mean that you need to download everything; but you only have to do it once. And keeping your update tree up-to-date is just a matter of croning the rsync to run once a day or so.
Hmm. I did state that I didn't want to download the whole tree. I have done this in the past, but I ended up downloading lots of unnecessary packages.
(One note: I do this, because I have 50 or 60 machines to keep up to date, running a bunch of different suse versions. If you're just looking to keep a few, like 3 or 5 or something, updated, then you probably want to use the public update sites per usual -- rsync can be a heavy protocol, and using it unnecessarily is generally considered to be a Bad Thing.)
Yes, I agree that this makes sense for medium-to-large sites: the savings from multiple accesses outweigh the overhead of downloading the whole tree. For small sites, though, I don't want to waste resources downloading packages three or more times, and I think you might find that the 3-5 machine sites outnumber the 50+ machine sites.

As I mention in another post in this thread, I've found a package which seems to address a lot of this. It's not complete, from my initial investigation, but I could possibly whip it into shape and use that. Something like this would save significant bandwidth on the servers.
--
Steve Boddy
On Thursday 13 July 2006 16:34, Stephen Boddy wrote:
As I mention in another post in this thread, I've found a package which seems to address a lot of this. It's not complete from my initial investigation, but I could possibly whip it into shape and use that. Something like this would save significant bandwidth on the servers.
Do remember to report back on progress (or lack of it) - it would be useful for others.
--
Pob hwyl / Best wishes
Kevin Donnelly

www.kyfieithu.co.uk - KDE yn Gymraeg
www.eurfa.org.uk - Geiriadur rhydd i'r Gymraeg
www.rhedadur.org.uk - Rhedeg berfau Cymraeg
www.cymrux.org.uk - Linux Cymraeg ar un CD
On Thursday 13 July 2006 22:17, Kevin Donnelly wrote:
On Thursday 13 July 2006 16:34, Stephen Boddy wrote:
As I mention in another post in this thread, I've found a package which seems to address a lot of this. It's not complete from my initial investigation, but I could possibly whip it into shape and use that. Something like this would save significant bandwidth on the servers.
Do remember to report back on progress (or lack of it) - it would be useful for others.
Well, I've already e-mailed the original author to clarify what is permitted, as there is no license mentioned or included in the archive. I'd like to re-use it, but if he doesn't respond I can't just rip off his work. That means I would have to come up with my own code, which makes the job much bigger. Although there are advantages to a wholly new design, in that I'd have a better understanding of the design and how it all hangs together. I'll report back in due course.
--
Steve Boddy
On Thursday 13 July 2006 22:17, Kevin Donnelly wrote:
On Thursday 13 July 2006 16:34, Stephen Boddy wrote:
As I mention in another post in this thread, I've found a package which seems to address a lot of this. It's not complete from my initial investigation, but I could possibly whip it into shape and use that. Something like this would save significant bandwidth on the servers.
Do remember to report back on progress (or lack of it) - it would be useful for others.
OK, I never heard back from the guy, so that avenue is dead. I've also moved to smart now, so this is a dead issue.
--
Steve Boddy
participants (3)
- Kevin Donnelly
- Marlier, Ian
- Stephen Boddy