Anders Johansson said the following on 02/15/2010 02:04 PM:
On Monday 15 February 2010 18:02:14 Lubos Lunak wrote:
On Monday 15 of February 2010, James Knott wrote:
Lubos Lunak wrote:
As also demonstrated in this thread, there is a widely accepted myth
that defragmenting is completely useless on Linux, and as a result nobody has been bothered enough to write a reasonably usable generic tool.
Given that modern file systems are fragmentation resistant, please explain how fragmentation is a problem on Linux.
I don't think there is a defragmentation tool in the world that knows which files are read together, and in which order that happens. Defragmentation invariably refers to making files contiguous on the disk. Anything else is called "optimisation", and as far as I know it requires human interaction, and knowledge of the process being optimised.
Correct. I recall seeing runs of the MS defrag tool, such as the one I ran on this laptop before shrinking the partition way, way down and putting SUSE on the rest of the drive. It just packed everything down tight.

Which is the wrong way to defrag: it means that once a file is altered, the defrag is 'broken'. A better strategy is to leave gaps for files to expand into. But you never know how much a file may need to expand, or which ones are going to expand, when the automated process packs them down.

Which gets to Anders's point about human guidance. Which may be wrong. I *think* that the system files shouldn't need alteration, but oops! there's a patch, and oops! there's an upgrade. And oops, that file in /home actually hasn't been changed in over 3 years ... it seems my strategy was wrong ... what a pity!

Somehow I don't think that kind of defrag does what you think it does. So, back to the idea of things like cylinder groups and locality: if we can't guarantee a contiguous block layout, then at least let's keep head motion down.

-- 
To unsubscribe, e-mail: opensuse+unsubscribe@opensuse.org
For additional commands, e-mail: opensuse+help@opensuse.org
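A rough way to put numbers on the head-motion argument: model each file access as a seek to that file's starting block, and compare the total head travel for a layout that keeps related files in one region against one that scatters them across the disk. A minimal sketch (the block numbers are made up for illustration; real extent locations would come from a tool like filefrag):

```python
# Sketch: compare total head travel for two hypothetical on-disk layouts.
# Block numbers below are invented for illustration only.

def total_seek_distance(block_sequence):
    """Sum of absolute head movements when the blocks are read in order."""
    return sum(abs(b - a) for a, b in zip(block_sequence, block_sequence[1:]))

# Files read together, placed in the same region (good locality):
local_layout = [100, 104, 110, 115, 120]

# The same files scattered across the disk (poor locality):
scattered_layout = [100, 90000, 250, 73000, 512]

print(total_seek_distance(local_layout))      # 20 blocks of travel
print(total_seek_distance(scattered_layout))  # 324888 blocks of travel
```

Even without perfectly contiguous files, keeping files that are used together in the same cylinder group cuts the travel by orders of magnitude, which is the point about locality above.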