I am creating a file of a specified size - I don't care what data is in it, although random would be nice. Currently I am doing this:
var sizeInMB = 3; // Up to many GB
using (FileStream stream = new FileStream(fileName, FileMode.Create))
{
    using (BinaryWriter writer = new BinaryWriter(stream))
    {
        while (writer.BaseStream.Length <= sizeInMB * 1000000)
        {
            writer.Write("a"); // This could be random. Also, larger strings improve performance obviously
        }
        writer.Close();
    }
}
This isn't efficient or even the right way to go about it. Any higher performance solutions?
Thanks for all the answers.
Edit
Ran some tests on the following methods for a 2 GB file (times in ms):
Method 1: Jon Skeet
byte[] data = new byte[sizeInMb * 1024 * 1024];
Random rng = new Random();
rng.NextBytes(data);
File.WriteAllBytes(fileName, data);
N/A - OutOfMemoryException for a 2 GB file
Method 2: Jon Skeet
byte[] data = new byte[8192];
Random rng = new Random();
using (FileStream stream = File.OpenWrite(fileName))
{
    for (int i = 0; i < sizeInMB * 128; i++)
    {
        rng.NextBytes(data);
        stream.Write(data, 0, data.Length);
    }
}
@1 KB buffer - 45,868, 23,283, 23,346
@128 KB buffer - 24,877, 20,585, 20,716
@8 KB buffer - 30,426, 22,936, 22,936
Method 3 - Hans Passant (Super Fast but data isn't random)
using (var fs = new FileStream(fileName, FileMode.Create, FileAccess.Write, FileShare.None))
{
    fs.SetLength(sizeInMB * 1024 * 1024);
}
257, 287, 3, 3, 2, 3 etc.
Well, a very simple solution:
byte[] data = new byte[sizeInMb * 1024 * 1024];
Random rng = new Random();
rng.NextBytes(data);
File.WriteAllBytes(fileName, data);
A slightly more memory-efficient version :)
// Note: block size must be a factor of 1MB to avoid rounding errors :)
const int blockSize = 1024 * 8;
const int blocksPerMb = (1024 * 1024) / blockSize;
byte[] data = new byte[blockSize];
Random rng = new Random();
using (FileStream stream = File.OpenWrite(fileName))
{
    for (int i = 0; i < sizeInMb * blocksPerMb; i++)
    {
        rng.NextBytes(data);
        stream.Write(data, 0, data.Length);
    }
}
However, if you do this several times in very quick succession, creating a new instance of Random each time, you may get duplicate data. See my article on randomness for more information - you could avoid this by using System.Security.Cryptography.RandomNumberGenerator, or by reusing the same instance of Random multiple times, with the caveat that it's not thread-safe.
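For reference, a sketch of what the RandomNumberGenerator variant might look like - this is my own illustration rather than part of the answer, reusing the block-writing loop above, with fileName standing in for the target path:
using System.IO;
using System.Security.Cryptography;

// Sketch: same block-writing loop as above, but fed from a cryptographic RNG
// so that rapid repeated runs cannot produce identical (seed-collision) data.
const int blockSize = 1024 * 8;
const int blocksPerMb = (1024 * 1024) / blockSize;
int sizeInMb = 3; // example size
byte[] data = new byte[blockSize];

using (var rng = RandomNumberGenerator.Create())
using (FileStream stream = File.OpenWrite(fileName))
{
    for (int i = 0; i < sizeInMb * blocksPerMb; i++)
    {
        rng.GetBytes(data); // fill the buffer with cryptographically random bytes
        stream.Write(data, 0, data.Length);
    }
}
It is noticeably slower than System.Random, but the output no longer depends on a time-based seed.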
There's no faster way than taking advantage of the sparse file support built into NTFS, the file system used on Windows hard disks. This code creates a one-gigabyte file in a fraction of a second:
using System;
using System.IO;
class Program {
    static void Main(string[] args) {
        using (var fs = new FileStream(@"c:\temp\onegigabyte.bin", FileMode.Create, FileAccess.Write, FileShare.None)) {
            fs.SetLength(1024 * 1024 * 1024);
        }
    }
}
When read, the file contains only zeros.
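If you want to confirm that for yourself, a small check along these lines (a sketch of mine, reusing the path and usings from the snippet above) reads back the start of the file and finds nothing but zero bytes:
// Sketch: read the first 4 KB back and verify the unwritten region is all zeros.
using (var fs = File.OpenRead(@"c:\temp\onegigabyte.bin"))
{
    var buffer = new byte[4096];
    int read = fs.Read(buffer, 0, buffer.Length);
    bool allZero = true;
    for (int i = 0; i < read; i++)
    {
        if (buffer[i] != 0) { allZero = false; break; }
    }
    Console.WriteLine(allZero ? "all zeros" : "found a non-zero byte");
}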
You can use the following class, which I created, to generate random strings:
using System;
using System.Text;
public class RandomStringGenerator
{
    readonly Random random;

    public RandomStringGenerator()
    {
        random = new Random();
    }

    public string Generate(int length)
    {
        if (length < 0)
        {
            throw new ArgumentOutOfRangeException("length");
        }

        var stringBuilder = new StringBuilder();
        for (int i = 0; i < length; i++)
        {
            // Append a random character in the range 0-254 (includes control characters).
            char ch = (char)random.Next(0, 255);
            stringBuilder.Append(ch);
        }
        return stringBuilder.ToString();
    }
}
Usage:
var randomStringGenerator = new RandomStringGenerator();
int length = 10;
string randomString = randomStringGenerator.Generate(length);
The efficient way to create a large file:
using (var fs = new FileStream(@"C:\temp\out.dat", FileMode.Create))
{
    fs.Seek(1024 * 6, SeekOrigin.Begin);
    System.Text.UTF8Encoding encoding = new System.Text.UTF8Encoding();
    fs.Write(encoding.GetBytes("test"), 0, 4);
}
However, this file will be empty (except for the "test" at the end). It's not clear what exactly you are trying to do -- a large file with data, or just a large file. You can modify this to sparsely write some data into the file too, without filling it up completely; see the sketch below.
If you do want the entire file filled with random data, then the only way I can think of is using the random bytes approach from Jon's answer above.
An improvement would be to fill a buffer of the desired size with the data and flush it all at once.
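As a rough sketch of the sparse-write idea (the path, chunk size, and chunk count here are illustrative values of mine, not part of the original answer), you could pre-size the file with SetLength and scatter a few random chunks through it:
using System;
using System.IO;

// Sketch: create a file of the target size, then write a handful of random
// chunks at scattered offsets instead of filling every byte.
long sizeInBytes = 1024L * 1024 * 1024; // 1 GB target size
var rng = new Random();
var chunk = new byte[8192];

using (var fs = new FileStream(@"C:\temp\out.dat", FileMode.Create, FileAccess.Write))
{
    fs.SetLength(sizeInBytes); // file is now the right length; unwritten regions read back as zeros
    for (int i = 0; i < 16; i++)
    {
        long offset = (long)(rng.NextDouble() * (sizeInBytes - chunk.Length));
        fs.Seek(offset, SeekOrigin.Begin);
        rng.NextBytes(chunk);
        fs.Write(chunk, 0, chunk.Length);
    }
}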