About Fragmentation

Posted by: Thom Denholm

Do you need to defragment? It mostly depends on your hardware and your use case. While defragmenting a file system can make a computer run faster, it's not the only answer.

Fragmentation usually occurs when a file is modified. Overwriting the file or making it larger means storing a fragment of the file in a new place, unless the file system creates an entirely new copy of the file. Databases are particularly susceptible here: they are usually large files, and they are often updated in the middle. Fragmentation can also happen when the file system stores a file in pieces from the start. This can occur if the file system is not configured to keep file blocks together, or if the media is fairly full and no free region is large enough to hold the new file in one piece.

What about the impact of fragmentation? In the days of rotating media, a fragmented file meant extra head movement and platter rotation to read the file. With flash media, the extra overhead is just additional block reads, a far smaller cost.

If you're using Reliance Nitro, avoiding fragmentation can be as simple as customizing your transaction points. Instead of transacting on a timed basis, create a new transaction point only when the entire file is on the media, at "file close". Similar settings may be available on other file systems.

If your use case still causes fragmentation, a valid workaround is to back up the database files and reformat the media. A fresh file system format is fairly quick on modern hardware, and it can be coupled with a bad block test as well.
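To see how growing a file leads to fragmentation, here is a toy sketch (not any real file system's allocator) of a first-fit block allocator: once a second file occupies the blocks right after the first, the first file's new blocks have to land somewhere else.

```python
# Toy first-fit block allocator over a fixed pool of blocks.
# Illustrates how growing a file after another file has been
# written forces the new blocks into a separate extent.

def first_fit(free, n):
    """Claim the first n free blocks and return their indices."""
    picked = [i for i, is_free in enumerate(free) if is_free][:n]
    for i in picked:
        free[i] = False
    return picked

def fragments(blocks):
    """Count contiguous runs in a sorted list of block indices."""
    return 1 + sum(1 for x, y in zip(blocks, blocks[1:]) if y != x + 1)

free = [True] * 16
file_a = first_fit(free, 4)    # file A occupies blocks 0-3
file_b = first_fit(free, 4)    # file B occupies blocks 4-7
file_a += first_fit(free, 2)   # A grows: its new blocks land after B

print(file_a)             # [0, 1, 2, 3, 8, 9]
print(fragments(file_a))  # 2 -- file A is now stored in two extents
```

On rotating media, reading `file_a` now costs an extra seek between blocks 3 and 8; on flash it is simply two read requests instead of one.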
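The "transact at file close" idea has a portable analogue on any POSIX-style system (Reliance Nitro's own transaction point configuration API is not shown here): write the entire file in one pass, then commit it to media once, rather than forcing data out mid-write.

```python
# Portable sketch of "commit only when the whole file is written":
# the file system sees the full file before anything is forced to
# media, giving it the best chance to allocate contiguous blocks.

import os

def write_whole_file(path, data):
    """Write all data, then force it to media once, at close."""
    with open(path, "wb") as f:
        f.write(data)           # buffered; no mid-file commit
        f.flush()
        os.fsync(f.fileno())    # single commit of the complete file

write_whole_file("example.bin", b"\x00" * 4096)
```

Syncing once at the end, instead of periodically during the write, is the same trade the article describes: fewer commit points, and each one captures a complete file.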

Read more about Reliance Nitro


Comments (1)

  1. garbage:
    Feb 10, 2011 at 11:31 PM

    I think that fragmentation in flash causes more 'garbage collection', and this will decrease performance. What do you think about this?

