Please show me the best/fastest methods for:
1) Loading very small binary files (for example, icons) into memory;
2) Loading/reading very big binary files of size 512 MB+;
3) Your common choice when you do not want to think about size/speed but must do only one thing: read all the bytes into memory?
Thank you!!!
P.S. Sorry if this is maybe a trivial question. Please do not close it ;)
P.S.2. Mirror of the analogous question for Java.
1: For very small files, File.ReadAllBytes will be fine.
2: For very big files on .NET 4.0, you can make use of memory-mapped files (MemoryMappedFile).
3: If you are not on .NET 4.0, then reading the data in chunks would be a good choice.
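To illustrate point 2, here is a minimal sketch of reading a slice of a large file through a memory-mapped view with the .NET 4.0 `MemoryMappedFile` API; the method name `ReadSlice` and the offset/length parameters are just illustrative choices, not anything from the question:

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

public static class MmfDemo
{
    // Reads `length` bytes starting at `offset` from a (possibly huge) file
    // via a memory-mapped view, without loading the whole file into memory.
    public static byte[] ReadSlice(string path, long offset, int length)
    {
        using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open))
        using (var accessor = mmf.CreateViewAccessor(offset, length,
                                                     MemoryMappedFileAccess.Read))
        {
            var data = new byte[length];
            // Position 0 here is relative to the start of the view,
            // i.e. to `offset` within the file.
            accessor.ReadArray(0, data, 0, length);
            return data;
        }
    }
}
```

The OS pages the file in on demand, so only the touched region ever occupies physical memory; that is what makes this attractive for 512 MB+ files.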
1) I'd use a resource file rather than storing them as lots of separate files.
2) You probably want to stream the data rather than read it all at once, in which case you can use a FileStream.
3) Use ReadAllBytes:
byte[] bytes = File.ReadAllBytes(path);
1: For small files, File.ReadAllBytes.
2: For big files, a Stream (FileStream) or a BinaryReader on a Stream - the purpose being to remove the need to allocate a massive buffer, by changing the code to read small chunks consecutively.
3: Go back and find the expected size; default to the worst case (#2).
Also note that I'd try to minimise the size in the first place, perhaps via the choice of data format, or compression.
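A sketch of the chunked approach from point 2: process the file through a small fixed-size buffer so no huge allocation is ever needed. The chunk size and the `CountNonZeroBytes` example task are arbitrary choices for illustration:

```csharp
using System;
using System.IO;

public static class ChunkDemo
{
    // Scans a file of any size using a fixed 64 KB buffer,
    // counting non-zero bytes as a stand-in for real per-chunk work.
    public static long CountNonZeroBytes(string path)
    {
        var buffer = new byte[64 * 1024]; // small, reusable chunk buffer
        long nonZero = 0;
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            int read;
            // FileStream.Read returns 0 at end of stream.
            while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                for (int i = 0; i < read; i++)
                    if (buffer[i] != 0) nonZero++;
            }
        }
        return nonZero;
    }
}
```

Memory use stays constant at the buffer size regardless of file size, which is the whole point of streaming instead of ReadAllBytes.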
This sample is good for both - for large files you need buffered reads.
public static byte[] ReadFile(string filePath)
{
    byte[] buffer;
    FileStream fileStream = new FileStream(filePath, FileMode.Open, FileAccess.Read);
    try
    {
        int length = (int)fileStream.Length; // get file length
        buffer = new byte[length];           // allocate a buffer for the whole file
        int count;                           // actual number of bytes read
        int sum = 0;                         // total number of bytes read
        // read until Read returns 0 (end of the stream has been reached)
        while ((count = fileStream.Read(buffer, sum, length - sum)) > 0)
            sum += count; // sum is the buffer offset for the next read
    }
    finally
    {
        fileStream.Close();
    }
    return buffer;
}