On 23 May 2006 at 13:30, Andreas Jaeger wrote:
Frank-Michael Fischer <fmfischer(a)gmx.net>
My first "rug sl" command took about 8
minutes to complete. The next one
was lightning fast.
Yes, it needs to parse all the stuff first, and then the daemon has it cached.
Into RAM? If so, wouldn't it be preferable to use some lightweight & fast database
library like Sleepycat (now: Oracle) to store metadata instead of XML (on disk)?
That way, the data structure would be ready almost immediately. I could imagine
either shipping the metadata database as a CD image, or as an "importable" file.
I can imagine (just a wild guess) that even a BerkeleyDB mounted over NFS isn't
much slower than XML-parsing the same data after downloading it. (If you are
asking about incremental updates.)
8 minutes is still too long, so I would like to see which
repositories you use (output of "rug sl") to get a feeling for whether
there's a reason for it. It needs roughly two minutes on my system.
How much virtual memory does it need then? >512MB?