9 Mar
2005
18:33
On Monday 2005-03-07 at 09:31 -0500, James Knott wrote:
One situation I was wondering about, was if you had a huge data file. Might it be easier to load the entire file into memory (real & virtual) and process it there, than by reading & writing blocks of the file as necessary?
Easier for the programmer, perhaps - provided he uses, while developing, a big enough computer or a small enough test file :-p

mmap() is used for just that, I understand: mapping the file into memory and working with it there. As I'm not a Linux programmer, I don't know whether big files are loaded in full. I have read its man page, but I'm no wiser.

--
Cheers,
Carlos Robinson