On Mon, 2010-02-15 at 18:02 +0100, Lubos Lunak wrote:
On Monday 15 of February 2010, James Knott wrote:
Lubos Lunak wrote:
As also demonstrated in this thread, there is a widely accepted myth that defragmenting is completely useless on Linux, and so nobody has really been bothered enough to write a reasonably usable generic tool.
Given that modern file systems are fragmentation resistant, please explain how fragmentation is a problem on Linux.
"Ok, first of all, talking about defragmenting is actually wrong. Defragmenting means making sure no file is fragmented, i.e. that every file occupies just one contiguous area of the disk. But do you know of any application today that reads just one file? What should be talked about instead is linearizing, i.e. making sure that related files (not one file, many files) occupy one contiguous area of the disk."

But then you get into linearization of the files to be read, which to me seems like a different ball game altogether, not to mention something of a logistical nightmare. If you have several heavily used apps that read 20 of the same files at startup, but whose other 80 files are generally different, linearizing this would seem to require either multiple copies of the shared files, or biasing the layout toward the most frequently used app, or the most preferred app.
--
To unsubscribe, e-mail: opensuse+unsubscribe@opensuse.org
For additional commands, e-mail: opensuse+help@opensuse.org
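The point of linearization can be sketched with a back-of-the-envelope model (purely illustrative, not a real benchmark; the block counts, seek threshold, and positions are all invented for the example): reading a set of startup files scattered across the disk forces one long seek per file, while the same files placed in one contiguous run are read with essentially no long seeks, even though no individual file is fragmented.

```python
# Illustrative-only model: compare the number of long seeks needed to read
# an app's 100 startup files when they are scattered across the disk versus
# placed ("linearized") in one contiguous area. All numbers are assumptions.
import random

random.seed(0)

DISK_BLOCKS = 1_000_000  # pretend block addresses on a 1M-block disk
NEAR = 1_000             # seeks shorter than this count as "cheap"

def long_seeks(block_positions):
    """Count expensive seeks while reading files in the given order."""
    return sum(
        1
        for a, b in zip(block_positions, block_positions[1:])
        if abs(b - a) > NEAR
    )

# 100 unfragmented files scattered randomly over the disk:
scattered = [random.randrange(DISK_BLOCKS) for _ in range(100)]

# The same 100 files linearized into one contiguous run of blocks:
linearized = list(range(500_000, 500_100))

print("scattered :", long_seeks(scattered))   # nearly every file read seeks
print("linearized:", long_seeks(linearized))  # prints 0
```

In this toy model the scattered layout pays roughly one long seek per file while the linearized layout pays none, which is the argument above: per-file contiguity alone does not help when the expensive part is hopping between many small related files.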