In my application I am gathering small data bits into a bigger file over time. As a result, the target file becomes excessively fragmented. What can be done to limit fragmentation of the output file with .NET?
The only way to reliably limit fragmentation is to allocate space for the complete file in one go (padding the unused remainder with zeros). This only works, of course, if you have some idea of the file's final size up front.
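A minimal sketch of that idea in .NET, using `FileStream.SetLength` to reserve the full size up front. The file name and the 100 MB estimate are assumptions for illustration; whether the space is physically allocated immediately or on first write depends on the filesystem:

```csharp
using System.IO;

class Preallocate
{
    static void Main()
    {
        // Assumption: we can estimate the final size (here ~100 MB).
        const long estimatedFinalSize = 100L * 1024 * 1024;

        using (var stream = new FileStream("output.dat", FileMode.Create,
                                           FileAccess.ReadWrite, FileShare.None))
        {
            // Reserve the full length up front; the extension reads as zeros.
            // This gives the filesystem a chance to allocate one contiguous run.
            stream.SetLength(estimatedFinalSize);

            // Subsequent writes land inside the pre-reserved region.
            stream.Seek(0, SeekOrigin.Begin);
            // ... write data bits as they arrive ...
        }
    }
}
```

If the estimate turns out to be too large, a final `SetLength` call can trim the file back to the actual data size before closing it.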
Another option is to grow the file by a fixed increment (say, 100 MB) whenever it fills up. That way you end up with fewer, larger fragments.
You could deliberately grow the file in larger chunks and internally track where the end of your actual data is. That way you give the filesystem a better chance of allocating contiguous space for your file.
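The chunked-growth approach could look something like this. `ChunkedAppender`, `GrowthChunk`, and `_logicalEnd` are hypothetical names introduced here for illustration; the 100 MB chunk size is likewise an assumption to tune for your workload:

```csharp
using System;
using System.IO;

// Sketch: grow the physical file in large steps while tracking the
// logical end of the data separately from the allocated length.
class ChunkedAppender : IDisposable
{
    private const long GrowthChunk = 100L * 1024 * 1024; // grow 100 MB at a time (assumption)
    private readonly FileStream _stream;
    private long _logicalEnd; // where real data ends, as opposed to allocated length

    public ChunkedAppender(string path)
    {
        _stream = new FileStream(path, FileMode.OpenOrCreate, FileAccess.ReadWrite);
        _logicalEnd = _stream.Length;
    }

    public void Append(byte[] data)
    {
        // Extend the file in large increments so the filesystem can
        // allocate bigger contiguous runs instead of many tiny ones.
        if (_logicalEnd + data.Length > _stream.Length)
        {
            long newLength = _stream.Length + GrowthChunk;
            while (newLength < _logicalEnd + data.Length)
                newLength += GrowthChunk;
            _stream.SetLength(newLength);
        }

        _stream.Seek(_logicalEnd, SeekOrigin.Begin);
        _stream.Write(data, 0, data.Length);
        _logicalEnd += data.Length;
    }

    public void Dispose()
    {
        // Trim the zero padding back to the real data size when finished.
        _stream.SetLength(_logicalEnd);
        _stream.Dispose();
    }
}
```

Note that `_logicalEnd` must be persisted somewhere (or be recoverable from the data format itself) if the process can be interrupted before `Dispose` trims the file.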