"Ulrich Windl" <ulrich.windl@rz.uni-regensburg.de> writes:
> On 23 May 2006 at 13:30, Andreas Jaeger wrote:
>> Frank-Michael Fischer <fmfischer@gmx.net> writes:
>>> My first "rug sl" command took about 8 minutes to complete. The next one was lightning fast.
>> Yes, it needs to parse all the stuff, and then the daemon has everything loaded.
> Into RAM? If so, wouldn't it be preferable to use some lightweight, fast database library like Sleepycat's (now: Oracle's) to store the metadata on disk instead of XML?
It lives in a SQLite database, but you still need to parse the XML once and store it in the database.
> That way, the data structure would be ready almost immediately. I could imagine either shipping the metadata database as a CD image, or as an "importable" database dump. I can imagine (just a wild guess) that even BerkeleyDB mounted over NFS isn't much slower than XML-parsing the same data after downloading it. (If you are asking about incremental updates)
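The pattern under discussion, which is to pay the XML-parsing cost once and serve all later queries from a database cache, can be sketched roughly as follows. This is a minimal illustration, not ZMD's or rug's actual code; the XML layout, table schema, and function names here are invented for the example.

```python
# Sketch: parse package metadata XML once, cache it in SQLite,
# and answer subsequent lookups from the database without re-parsing.
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical metadata fragment; the real repodata format is richer.
SAMPLE_XML = """<metadata>
  <package><name>bash</name><version>3.0</version></package>
  <package><name>zsh</name><version>4.2</version></package>
</metadata>"""

def import_metadata(conn, xml_text):
    # The slow first run: parse the XML and fill the cache table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS packages (name TEXT PRIMARY KEY, version TEXT)")
    root = ET.fromstring(xml_text)
    for pkg in root.iter("package"):
        conn.execute("INSERT OR REPLACE INTO packages VALUES (?, ?)",
                     (pkg.findtext("name"), pkg.findtext("version")))
    conn.commit()

def lookup(conn, name):
    # Fast subsequent runs: a simple indexed query, no XML parsing.
    row = conn.execute(
        "SELECT version FROM packages WHERE name = ?", (name,)).fetchone()
    return row[0] if row else None

conn = sqlite3.connect(":memory:")
import_metadata(conn, SAMPLE_XML)
print(lookup(conn, "bash"))  # -> 3.0
```

Shipping the populated database file (or a dump of it) instead of raw XML, as suggested above, would skip the `import_metadata` step on the client entirely; the trade-off is that clients on other SQLite versions or architectures must still be able to read the shipped file.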
8 minutes is still too long, so I would like to see which repositories you use (output of "rug sl") to get a feeling for whether there's a reason for it. It takes roughly two minutes on my system.
> How much virtual memory does it need then? >512 MB?
< 256 MB on my i386 system.

Andreas
--
 Andreas Jaeger, aj@suse.de, http://www.suse.de/~aj/
  SUSE LINUX Products GmbH, Maxfeldstr. 5, 90409 Nürnberg, Germany
   GPG fingerprint = 93A3 365E CE47 B889 DF7F FED1 389A 563C C272 A126