I am refactoring some code and have a question that I could use a few comments on.
The original code downloads a file to a stream. Then it writes the stream to a file in a temp directory before using File.Copy to overwrite an existing file in the production directory.
Are there any benefits to writing it to the temp dir first and using File.Copy, in contrast to just writing the stream to the production directory right away?
One reason could be that File.Copy is faster than writing a stream, reducing the chance that someone is reading the file while it's being written. But can that even happen? What else should I keep in mind? I am considering removing the temp directory step.
MemoryStream stream = new MemoryStream();
// ... download and validate stream ...
using (Stream sourceFileStream = stream)
{
using (FileStream targetFileStream = new FileStream(tempPath, FileMode.CreateNew))
{
const int bufferSize = 8192;
byte[] buffer = new byte[bufferSize];
int read;
while ((read = sourceFileStream.Read(buffer, 0, bufferSize)) > 0)
{
targetFileStream.Write(buffer, 0, read);
}
}
}
File.Copy(tempPath, destination, true);
in contrast to just writing the stream to destination.
This is just the code I had; I would probably use something like sourceFileStream.CopyToAsync(targetFileStream); instead.
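For reference, a minimal sketch of that variant, assuming tempPath and destination are defined as above and the surrounding method is async:

```csharp
// Rewind the MemoryStream before copying, since the download left the
// position at the end of the data.
stream.Position = 0;

using (FileStream targetFileStream = new FileStream(tempPath, FileMode.CreateNew))
{
    // CopyToAsync replaces the manual buffer/read/write loop above.
    await stream.CopyToAsync(targetFileStream);
}

File.Copy(tempPath, destination, true);
```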
Well, think about what happens when you start the download, overwrite the existing file, and then for some reason the download gets aborted: you'd be left with a broken file. Downloading it to another location first and then copying it to the target directory avoids that problem.
EDIT: okay, seeing the code now. If the file is already in the MemoryStream, there's really no reason to write the file to a temp location and copy it over. You could just do
File.WriteAllBytes(destination, stream.ToArray());
It is better practice to assemble the stream of bytes into a file in an isolated location, and only after it is assembled copy it to the production area, for the following reasons:
1. Assume a power failure during the assembly phase. If it happens in an isolated folder such as 'temp', you just end up with a partial file that you can detect and discard later. However, if you assemble the file directly in production and a power failure occurs, then the next time you start your application you will need to check the integrity of every file that is not 'static' to your app.
2. If the file is in use on production and the new file being assembled is large, your users end up waiting for the whole assembly process to complete. When the buffers are assembled elsewhere, as in your example, simply copying the finished file means a much shorter wait for your users.
3. Assume the disk is full... same problem as in #1.
4. Assume another process in your application has a memory leak and the application crashes before completing the assembly.
5. Assume [fill in your disaster case]...
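The "assemble in isolation, then publish" pattern the points above describe can be sketched like this (paths and the validated stream variable are placeholders matching the question's code; adjust to your layout):

```csharp
// Assemble the file in an isolated temp location first; only a complete,
// fully written file is ever copied into the production directory.
string tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
try
{
    stream.Position = 0;  // rewind the downloaded MemoryStream
    using (FileStream target = new FileStream(tempPath, FileMode.CreateNew))
    {
        stream.CopyTo(target);
    }

    // Publish: a crash before this line leaves production untouched,
    // with at worst a partial file in temp that can be discarded.
    File.Copy(tempPath, destination, true);
}
finally
{
    File.Delete(tempPath);  // File.Delete is a no-op if the file is gone
}
```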
So, yes, it is better practice to do as in your example. But the real question is how important this file is to your application: is it just another data file, like a 'save-game' file, or is it something that can crash your application if invalid?
File.Copy simply encapsulates the usage of streams, etc. No difference at all.
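In other words, File.Copy(source, destination, true) behaves roughly like the manual stream copy below (roughly, because File.Copy also carries over file attributes, which this sketch does not):

```csharp
// Rough stream-based equivalent of File.Copy(sourcePath, destinationPath, true).
// FileMode.Create overwrites the destination if it already exists, matching
// the overwrite: true argument of File.Copy.
using (FileStream src = File.OpenRead(sourcePath))
using (FileStream dst = new FileStream(destinationPath, FileMode.Create))
{
    src.CopyTo(dst);
}
```

So from a pure I/O standpoint there is no performance magic in File.Copy over writing the stream yourself; the benefit in the question's code comes from *where* the bytes are written first, not *how*.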