Why would anybody use the "standard" random number generator from `System.Random` at all instead of always using the cryptographically secure random number generator from `System.Security.Cryptography.RandomNumberGenerator` (or rather one of its subclasses, since `RandomNumberGenerator` is abstract)?
Nate Lawson tells us in his Google Tech Talk presentation "Crypto Strikes Back" at minute 13:11 not to use the "standard" random number generators from Python, Java and C# and to instead use the cryptographically secure version.
I know the difference between the two versions of random number generators (see question 101337).
But what rationale is there to not always use the secure random number generator? Why use System.Random at all? Performance perhaps?
First of all, the presentation you linked only talks about random numbers for security purposes, so it doesn't claim `Random` is bad for non-security purposes. But I do claim it is. The .NET 4 implementation of `Random` is flawed in several ways. I recommend using it only if you don't care about the quality of your random numbers, and using a better third-party implementation otherwise.

**Flaw 1: The seeding**
The default constructor seeds with the current time. Thus all instances of `Random` created with the default constructor within a short time-frame (about 10 ms) return the same sequence. This is documented and "by design". It is particularly annoying if you want to multi-thread your code, since you can't simply create an instance of `Random` at the beginning of each thread's execution.

The workaround is to be extra careful when using the default constructor and to seed manually when necessary.
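A minimal sketch of both the problem and the workaround. The `createRandom` helper is my own illustration, not BCL API, and on modern .NET runtimes the parameterless constructor no longer seeds from the clock, so the first check only reproduces the issue on .NET 4-era runtimes:

```csharp
using System;

class SeedDemo
{
    static void Main()
    {
        // Two instances created back to back share the same time-based seed
        // on .NET 4, so they produce identical sequences.
        var a = new Random();
        var b = new Random();
        Console.WriteLine(a.Next() == b.Next());  // "True" on .NET 4-era runtimes

        // Workaround sketch: derive distinct seeds from a shared master RNG
        // guarded by a lock, so each thread gets its own differently-seeded
        // instance. (Names here are my own, not part of the BCL.)
        var masterRng = new Random(12345);
        object masterLock = new object();
        Func<Random> createRandom = () =>
        {
            lock (masterLock) { return new Random(masterRng.Next()); }
        };

        var c = createRandom();
        var d = createRandom();
        Console.WriteLine(c.Next() == d.Next());  // almost certainly "False"
    }
}
```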
Another problem here is that the seed space is rather small (31 bits). So if you generate 50k instances of `Random` with perfectly random seeds, you will probably get one sequence of random numbers twice (due to the birthday paradox). So manual seeding isn't easy to get right either.

**Flaw 2: The distribution of random numbers returned by `Next(int maxValue)` is biased**

There are parameters for which `Next(int maxValue)` is clearly not uniform. For example, if you calculate `r.Next(1431655765) % 2` you will get `0`
in about 2/3 of the samples. (Sample code at the end of the answer.)

**Flaw 3: The `NextBytes()` method is inefficient**

The per-byte cost of `NextBytes()` is about as big as the cost to generate a full integer sample with `Next()`. From this I suspect that they indeed create one sample per byte. A better implementation using 3 bytes out of each sample would speed `NextBytes()` up by almost a factor of 3.

Thanks to this flaw, `Random.NextBytes()` is only about 25% faster than `System.Security.Cryptography.RNGCryptoServiceProvider.GetBytes` on my machine (Win7, Core i3 2600 MHz).

I'm sure if somebody inspected the source/decompiled byte code they'd find even more flaws than I found with my black-box analysis.
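The speed-up described above can be sketched like this. This is my own illustration of the idea, not the BCL implementation; it extracts the three low bytes of each `Next()` sample instead of drawing one sample per byte, ignoring the tiny bias that comes from `Next()`'s range being 2^31 − 1 rather than a power of two:

```csharp
using System;

static class FastNextBytes
{
    // Sketch: consume 3 bytes per Next() call instead of one full sample
    // per byte. Each Next() sample carries 31 random bits, so the low
    // 24 bits give us three nearly uniform bytes.
    public static void NextBytes3(Random r, byte[] buffer)
    {
        int i = 0;
        while (i < buffer.Length)
        {
            int sample = r.Next();  // 31 random bits
            for (int j = 0; j < 3 && i < buffer.Length; j++)
            {
                buffer[i++] = (byte)sample;  // take the low byte
                sample >>= 8;                // shift the next byte down
            }
        }
    }
}
```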
**Code samples**
`r.Next(0x55555555) % 2` is strongly biased:

Performance:
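The original code samples did not survive in this copy; the following is my reconstruction of what they plausibly measured, not the author's code. The bias counter should converge to roughly 2/3 zeros on a .NET 4-era `Random`, and the timing numbers will vary by machine and runtime:

```csharp
using System;
using System.Diagnostics;
using System.Security.Cryptography;

class CodeSamples
{
    static void Main()
    {
        // Bias: Next(0x55555555) maps 2^31 input states onto 0x55555555
        // outputs, so even results receive two input states each while odd
        // results receive only one -- about 2/3 of the samples are even.
        var r = new Random();
        int n = 10000000;
        int zeros = 0;
        for (int i = 0; i < n; i++)
        {
            if (r.Next(0x55555555) % 2 == 0) zeros++;
        }
        // Roughly 0.667 with the .NET 4-era algorithm; newer runtimes
        // changed the unseeded implementation.
        Console.WriteLine("Fraction of zeros: {0:F4}", (double)zeros / n);

        // Performance: fill 100 MB with each generator and compare.
        var buffer = new byte[1000000];
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < 100; i++) r.NextBytes(buffer);
        Console.WriteLine("Random.NextBytes: {0} ms", sw.ElapsedMilliseconds);

        using (var csp = new RNGCryptoServiceProvider())
        {
            sw.Restart();
            for (int i = 0; i < 100; i++) csp.GetBytes(buffer);
            Console.WriteLine("RNGCryptoServiceProvider.GetBytes: {0} ms",
                sw.ElapsedMilliseconds);
        }
    }
}
```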