Listmates,

A couple of questions. In C, I am working on a small program that reads deposition transcripts (text files) into a linked list and then steps through the list to classify the lines, search, and so on. What I would like to know is: how do I tell how much memory the program uses?

The list structure is basically (the original posting declared lineno twice, which won't compile; the duplicate is dropped here):

    struct record {
        char *line;
        int lineno;
        int linetype;
        int pageno, recordno;
        struct record *next;
    };

Each file is read with the line length determined by

    length = getline(&lineptr, &n, depo);

with each char *line allocated as length + 1 bytes with malloc.

Is there a way to determine how much memory I'm using besides doing some kind of sum over each node in the list, where the memory for each node would be something like (length + 1 bytes for the string + four ints + a pointer, i.e. sizeof(struct record))? Or could I just take the text file size on disk plus the number of structures * sizeof(struct record)?

Also, more generally, for searching and working on the text, would reading the text into something other than a linked list be better? I guess I could use one big buffer. The text files for a single transcript are rarely more than 500K. Eventually, I would like to be able to search through any number of transcripts, but they need not all be in memory at the same time (although for speed that would help). Any favorite data structures for text handling?

Next, and more importantly... who has a good link to a gdb tutorial? (I have found a few, but I would love to have a good one recommended.)

Thanks.

-- 
David C. Rankin, J.D., P.E.
Rankin Law Firm, PLLC
510 Ochiltree Street
Nacogdoches, Texas 75961
Telephone: (936) 715-9333
Facsimile: (936) 715-9339
www.rankinlawfirm.com
-- 
To unsubscribe, e-mail: opensuse+unsubscribe@opensuse.org
For additional commands, e-mail: opensuse+help@opensuse.org