I am relatively new to C# so please bear with me.
I am writing a business application (in C#, .NET 4) that needs to be reliable. Data will be stored in files. Files will be modified (rewritten) regularly, so I am afraid that something could go wrong while saving (power loss, the application getting killed, a system freeze, ...), which would (I think) result in a corrupted file. I know that data which wasn't saved yet is lost, but I must not lose data which was already saved (because of corruption or ...).
My idea is to have 2 versions of every file and each time rewrite the oldest file. Then in case of unexpected end of my application at least one file should still be valid.
Is this a good approach? Is there anything else I could do? (Database is not an option)
Thank you for your time and answers.
Rather than "always write to the oldest" you can use the "safe file write" technique of:
(Assuming you want to end up saving data to `foo.data`, and a file with that name contains the previous valid version.)

1. Write to `foo.data.new`
2. Rename `foo.data` to `foo.data.old`
3. Rename `foo.data.new` to `foo.data`
4. Delete `foo.data.old`
At any one time you've always got at least one valid file, and you can tell which is the one to read just from the filename. This is assuming your file system treats rename and delete operations atomically, of course.
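As a rough sketch of those four steps in C# (the `SafeFileWriter` class and `byte[]` payload are just illustrative, not a prescribed API):

```csharp
using System.IO;

static class SafeFileWriter
{
    // Minimal sketch of the four steps above; assumes the file system
    // performs renames (File.Move) and deletes atomically.
    public static void Save(string path, byte[] data)
    {
        string newPath = path + ".new";
        string oldPath = path + ".old";

        File.WriteAllBytes(newPath, data); // 1. write the new version

        if (File.Exists(path))
            File.Move(path, oldPath);      // 2. previous version becomes .old

        File.Move(newPath, path);          // 3. new version becomes current

        File.Delete(oldPath);              // 4. clean up (no-op if absent)
    }
}
```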
- If `foo.data` and `foo.data.new` exist, load `foo.data`; `foo.data.new` may be broken (e.g. power off during write)
- If `foo.data.old` and `foo.data.new` exist, both should be valid, but something died very shortly afterwards - you may want to load the `foo.data.old` version anyway
- If `foo.data` and `foo.data.old` exist, then `foo.data` should be fine, but again something went wrong, or possibly the file couldn't be deleted (a sketch of this load logic follows below)

Alternatively, simply always write to a new file, including some sort of monotonically increasing counter - that way you'll never lose any data due to bad writes. The best approach depends on what you're writing though.
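To make those recovery cases concrete, here is a hedged sketch of a matching load-time check (the `PickFileToLoad` helper is hypothetical):

```csharp
using System.IO;

static class SafeFileLoader
{
    // Decide which file to trust, based on which of the three
    // names from the recovery cases above exist on disk.
    public static string PickFileToLoad(string path)
    {
        string oldPath = path + ".old";

        // If the current file exists it is complete: any .new next to it
        // may be a broken partial write, and any leftover .old just means
        // the final delete step never ran.
        if (File.Exists(path))
            return path;

        // No current file: the save died between the two renames,
        // so the .old file holds the last complete version.
        if (File.Exists(oldPath))
            return oldPath;

        throw new FileNotFoundException("No valid data file found.", path);
    }
}
```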
You could also use `File.Replace` for this, which basically performs the last three steps for you. (Pass in `null` for the backup name if you don't want to keep a backup.)
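For example (a minimal sketch; the file names are placeholders):

```csharp
using System.IO;

class Example
{
    static void Main()
    {
        // Write the new version to a scratch file first...
        File.WriteAllText("foo.data.new", "updated data");

        // ...then swap it into place. The third argument names the backup
        // file; pass null to skip keeping a backup. Note that File.Replace
        // requires the destination ("foo.data") to already exist.
        File.Replace("foo.data.new", "foo.data", "foo.data.old");
    }
}
```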
In principle there are two popular approaches to this:

- make your file format append-only, so a save only ever adds data at the end and never overwrites what was already written,

or

- write the complete data to a new file and swap it with the old one afterwards (the rename/replace technique described above).
The first leaves you with (way) more development effort, but also has the advantage of making saves go faster if you save small changes to large files (Word used to do this AFAIK).
A lot of programs use this approach, but they usually keep more copies, to also guard against human error.

For example, CadSoft Eagle (a program used to design circuits and printed circuit boards) keeps up to 9 backup copies of the same file, naming them file.b#1 ... file.b#9.
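A rolling-backup scheme along those lines might look like this (a sketch only; the `.b#1` naming is borrowed from Eagle, but the `RotateBackups` helper is hypothetical):

```csharp
using System.IO;

static class BackupRotator
{
    // Keep up to 9 rolling backups (file.b#1 is the newest).
    // Call this just before overwriting "path" with a new version.
    public static void RotateBackups(string path)
    {
        const int maxBackups = 9;

        // Drop the oldest backup (File.Delete is a no-op if it's absent)...
        File.Delete(path + ".b#" + maxBackups);

        // ...shift the remaining backups up by one...
        for (int i = maxBackups - 1; i >= 1; i--)
        {
            string from = path + ".b#" + i;
            if (File.Exists(from))
                File.Move(from, path + ".b#" + (i + 1));
        }

        // ...and copy the current version to backup #1.
        if (File.Exists(path))
            File.Copy(path, path + ".b#1");
    }
}
```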
Another thing you can do to improve safety is hashing: append a hash such as a CRC32 or MD5 to the end of the file. When you open the file you recompute the hash; if it doesn't match the stored one, the file is corrupted. This also protects you against people who, accidentally or on purpose, modify your file with another program, and it gives you a way to detect that a hard drive or USB disk has become corrupted.
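A sketch of that idea using MD5 (MD5 ships with the .NET Framework; CRC32 does not, so MD5 is assumed here):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Security.Cryptography;

static class HashedFile
{
    // Append the MD5 hash (16 bytes) of the payload to the file...
    public static void Save(string path, byte[] payload)
    {
        byte[] output;
        using (var md5 = MD5.Create())
        {
            byte[] hash = md5.ComputeHash(payload);
            output = payload.Concat(hash).ToArray();
        }
        File.WriteAllBytes(path, output);
    }

    // ...and verify it again when loading.
    public static byte[] Load(string path)
    {
        byte[] all = File.ReadAllBytes(path);
        if (all.Length < 16)
            throw new InvalidDataException("File too short to contain a hash.");

        byte[] payload = all.Take(all.Length - 16).ToArray();
        byte[] stored = all.Skip(all.Length - 16).ToArray();

        using (var md5 = MD5.Create())
        {
            if (!md5.ComputeHash(payload).SequenceEqual(stored))
                throw new InvalidDataException("Hash mismatch - file is corrupted.");
        }
        return payload;
    }
}
```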
Of course, the faster the save operation is, the smaller the window in which you can lose data, but you can never be sure that nothing will happen during or after writing.

Consider that hard drives, USB drives and the Windows OS all use caches, which means that even after your write call returns, the OS or the disk itself may still not have physically written the data to the disk.
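You can shrink (but not eliminate) that window by asking the OS to flush its file buffers to the device; since .NET 4, `FileStream.Flush(true)` does exactly that:

```csharp
using System.IO;

static class FlushedWriter
{
    // Write data and ask the OS to push its buffers to the device.
    // The drive's own hardware cache may still delay the physical write.
    public static void Write(string path, byte[] data)
    {
        using (var stream = new FileStream(path, FileMode.Create,
                                           FileAccess.Write, FileShare.None))
        {
            stream.Write(data, 0, data.Length);
            stream.Flush(true); // true = flush intermediate buffers to disk
        }
    }
}
```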
Another thing you can do: save to a temporary file first, and only if everything went well move the file to the real destination folder. This reduces the risk of ending up with half-written files.
You can mix all these techniques together.