If I run this test:
var r = new Random();
var ints = new int[13];
Parallel.For(0, 2000000, i => {
    var result = r.Next(1, 7) + r.Next(1, 7);
    ints[result] += 1;
});
I get the following result (each line shows a total and its percentage of the rolls):
2: 92,14445
3: 0,41765
4: 0,62245
5: 0,82525
6: 1,04035
7: 1,25215
8: 1,0531
9: 0,8341
10: 0,6334
11: 0,4192
12: 0,2109
When I use the regular For:
for (int i = 0; i < 2000000; i++) {
    var result = r.Next(1, 7) + r.Next(1, 7);
    ints[result] += 1;
}
The output is:
2: 2,7797
3: 5,58645
4: 8,3414
5: 11,09935
6: 13,8909
7: 16,6731
8: 13,82895
9: 11,10205
10: 8,3424
11: 5,5712
12: 2,7845
The last result is a triangular distribution, which is the expected output.
The purpose of my question is not to discuss the applicability of parallelism. The question is why the Parallel.For behaves that way?
In addition to @spencerruport's assertion that the Random class is not thread safe, your parallel code is also not thread-safe: ints[result] += 1 is a non-atomic read-modify-write on a shared array, so concurrent iterations can overwrite each other's increments.
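A minimal sketch of the original loop with both problems addressed (the class and method names here are mine, not from the post): a lock serializes calls to the shared Random, and Interlocked.Increment makes each counter update atomic.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public static class DiceTest
{
    // Rolls two dice n times in parallel and returns a histogram of the totals
    // (index = total, so only slots 2..12 are ever used).
    public static int[] RollHistogram(int n)
    {
        var r = new Random();
        var sync = new object();  // guards the shared Random instance
        var ints = new int[13];

        Parallel.For(0, n, i =>
        {
            int result;
            lock (sync)  // Random.Next is not thread safe, so serialize access
            {
                result = r.Next(1, 7) + r.Next(1, 7);
            }
            Interlocked.Increment(ref ints[result]);  // atomic counter update
        });

        return ints;
    }

    public static void Main()
    {
        var ints = RollHistogram(2000000);
        for (int total = 2; total <= 12; total++)
            Console.WriteLine($"{total}: {100.0 * ints[total] / 2000000}");
    }
}
```

The lock makes the random draws correct but also serializes them, so this mostly demonstrates correctness rather than speed.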
It might be better to leverage PLINQ to do the collecting of results on your behalf, instead of managing access to shared memory (your ints array above) yourself.
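The PLINQ code referenced by the answer was stripped from this copy; a sketch under the same assumptions (a lock guarding the shared Random, and method names of my own invention) could look like this. PLINQ performs the aggregation itself, so there is no shared counter array to race on.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class PlinqDice
{
    private static readonly Random Rng = new Random();
    private static readonly object Sync = new object();

    // Thread-safe roll of two dice; the lock serializes access to the shared Random.
    private static int RollTwo()
    {
        lock (Sync)
        {
            return Rng.Next(1, 7) + Rng.Next(1, 7);
        }
    }

    // GroupBy/ToDictionary build the histogram without any shared mutable state.
    public static Dictionary<int, int> Histogram(int n)
    {
        return ParallelEnumerable.Range(0, n)
            .Select(_ => RollTwo())
            .GroupBy(total => total)
            .ToDictionary(g => g.Key, g => g.Count());
    }

    public static void Main()
    {
        foreach (var pair in Histogram(2000000).OrderBy(p => p.Key))
            Console.WriteLine($"{pair.Key}: {100.0 * pair.Value / 2000000}");
    }
}
```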
SafeRandom might look something like this:
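The SafeRandom implementation itself is missing from this copy; a common sketch gives each thread its own Random via ThreadLocal, seeding each one differently so threads don't replay the same sequence:

```csharp
using System;
using System.Threading;

// One Random per thread: ThreadLocal removes the sharing entirely, and the
// Interlocked-incremented seed keeps threads from producing identical sequences.
public static class SafeRandom
{
    private static int _seed = Environment.TickCount;

    private static readonly ThreadLocal<Random> Local =
        new ThreadLocal<Random>(() => new Random(Interlocked.Increment(ref _seed)));

    public static int Next(int minValue, int maxValue)
    {
        return Local.Value.Next(minValue, maxValue);
    }
}
```

With this in place, the question's loop can call SafeRandom.Next(1, 7) instead of r.Next(1, 7); note that the ints[result] += 1 race still needs Interlocked.Increment or a lock of its own.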
The issue is the thread safety of Random: I get the expected triangular distribution once I've made the call to Random.Next() thread safe.

The Random class methods are not thread safe:
http://msdn.microsoft.com/en-us/library/system.random.next(v=vs.90).aspx#2
So the first piece of code is just demonstrating some undefined behavior.
EDIT:
As for a little speculation, from what little I know about operating systems I believe random number generation is a pretty low level operation and hence might even require a context switch. While this is happening you may end up grabbing the same random number multiple times before it's had a chance to update. This would account for the lopsided distribution.