var fsw = new FileSystemWatcher(sPath, "*.PPF");
fsw.NotifyFilter = NotifyFilters.FileName;
fsw.IncludeSubdirectories = true;
fsw.Created += FswCreated;
fsw.EnableRaisingEvents = true;

static void FswCreated(object sender, FileSystemEventArgs e)
{
    string sFile = e.FullPath;
    // Throws an IOException for large files: the Created event fires
    // while the copy is still in progress.
    string[] arrLines = File.ReadAllLines(sFile);
}
This fails with large files, because the other process is not finished writing the file. The file is copied over the network, so I don't know its size in advance. What kind of synchronisation is required to make this robust?
Use the DelayedFileSystemWatcher.cs class (http://blogs.msdn.com/b/ahamza/archive/2006/02/06/526222.aspx) and then this code. Check the PrintFileSystemEventHandler event handler. It tries to open the file in a FileStream, and if an IO error is raised it assumes the file is still being written, so it waits for an interval (2 seconds in this example) and then tries again. Check the CONVERSION: label. I don't have the link to the project, but this will help.
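The code the answer refers to is not reproduced here; a minimal sketch of the pattern it describes (retry on IOException, jumping back to a CONVERSION: label), assuming System.IO and System.Threading, might look like this:

static void PrintFileSystemEventHandler(object sender, FileSystemEventArgs e)
{
CONVERSION:
    try
    {
        // An exclusive open throws an IOException while the writer still holds the file.
        using (var fs = new FileStream(e.FullPath, FileMode.Open, FileAccess.Read, FileShare.None))
        {
            // The file is no longer locked; process it here.
        }
    }
    catch (IOException)
    {
        Thread.Sleep(2000); // assume the file is still being written; wait 2 seconds
        goto CONVERSION;    // then try again
    }
}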
Simply sleep in your FswCreated for about half a second with Thread.Sleep(500), if that's possible. That should give the computer the time it needs to finish writing the file. Of course, for slower hard drives, this may or may not be enough time.
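Applied to the handler from the question, that is simply (a sketch; 500 ms is an arbitrary guess and gives no real guarantee for large network copies):

static void FswCreated(object sender, FileSystemEventArgs e)
{
    Thread.Sleep(500); // give the writer a moment to finish the file
    string[] arrLines = File.ReadAllLines(e.FullPath);
}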
I support the solution accepted by Shay Erlichmen. However:
a) You may want to open the file with access mode FileAccess.Read, in case it's a read-only file.
b) While downloading, some programs give the file a temporary extension and rename it when the download completes, so even though the file has been written you will get a file-not-found exception.
So handle the exceptions, and also subscribe to the FileSystemWatcher.Renamed event.
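A sketch of both suggestions, assuming the fsw watcher from the question (the temporary-extension behavior depends on the program doing the copy):

// Some writers create e.g. "file.ppf.tmp" and rename it on completion,
// so watch for renames as well as creations.
fsw.Renamed += (sender, e) => ProcessFile(e.FullPath);

static void ProcessFile(string path)
{
    try
    {
        // Read-only access with a permissive share mode, so read-only
        // files (and files still open elsewhere) can still be read.
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        using (var reader = new StreamReader(fs))
        {
            string contents = reader.ReadToEnd();
        }
    }
    catch (FileNotFoundException)
    {
        // The file was renamed away before we got to it; the Renamed
        // handler will see it again under its final name.
    }
}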
Solution found on Stack Overflow and modified a bit.
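The modified snippet itself is missing from this answer; a common Stack Overflow variant of the idea polls until the file can be opened exclusively (the IsFileReady name and the one-second interval are illustrative assumptions):

static bool IsFileReady(string path)
{
    try
    {
        // If we can open the file exclusively, the writer has released it.
        using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
            return true;
    }
    catch (IOException)
    {
        return false; // still locked by the copy operation
    }
}

static void FswCreated(object sender, FileSystemEventArgs e)
{
    while (!IsFileReady(e.FullPath))
        Thread.Sleep(1000); // poll once a second until the copy completes
    string[] arrLines = File.ReadAllLines(e.FullPath);
}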
AFAIK you don't get notified once the copy is done, so you have to implement a retry mechanism.
If you get a sharing violation, just trigger a timer to retry the operation in X seconds.
The second retry should be after X*2 seconds, and so on (with some limit, of course).
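A sketch of that back-off, using Task.Delay (System.Threading.Tasks) in place of a timer; the initial delay and the retry cap are assumptions:

static async Task<string[]> ReadWithBackoffAsync(string path)
{
    var delay = TimeSpan.FromSeconds(1);          // X
    for (int attempt = 0; attempt < 6; attempt++) // cap the retries
    {
        try
        {
            return File.ReadAllLines(path);
        }
        catch (IOException) // sharing violation: the copy is not done yet
        {
            await Task.Delay(delay);
            delay += delay; // X, 2X, 4X, ... seconds
        }
    }
    throw new IOException("Gave up waiting for the copy to finish: " + path);
}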