I'm trying to create a percentage-based probability for a game. E.g. if an item has a 45% chance of a critical hit, I expect that 45 out of 100 hits would be critical.
First, I tried to use a simple solution:
Random R = new Random();          // create the generator once
int C = R.Next(1, 101);           // uniform in 1..100 inclusive
if (C <= ProbabilityPercent) DoSomething();
But over 100 iterations with a chance of e.g. 48%, it gives anywhere from 40 to 52 criticals out of 100. The same goes for 49, 50, and 51, so there is no practical difference between these percentages.
The question is: how do I set a percentage of e.g. 50 and get exactly 50 out of 100 while still using randomness? This is very important for the probability of finding rare items, where an item can grant a buff that increases the find chance, so a buff of 1% should be noticeable, which it currently is not.
Sorry for my bad English.
You need to think in terms of a uniform distribution over repeated rolls.
You can't look at just 100 rolls, because forcing those to yield exactly 45 would not be random. Such rolls should exhibit "lack of memory": for example, if you roll a die looking for a 6, you have a 1-in-6 chance. If you roll it 5 times and don't get a six, the chance of getting a 6 on the next roll is not 1; it is still 1 in 6. As such, you can only look at how well the results meet your expectation when amortized over a statistically large number of events... 100,000, say.
Basically: your current code is fine. If the user knows (because they've hit 55 times without a critical) that the next 45 hits must be critical, then it is no longer random and they can game the system.
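To see that, here is a minimal sketch (the variable names and the 45% figure are just placeholders, not from the original post) that runs the same check 100,000 times and prints the observed ratio:

using System;

var rng = new Random();              // created once, reused for every roll
int probabilityPercent = 45;         // example value
int rolls = 100000;
int criticals = 0;
for (int i = 0; i < rolls; i++)
{
    if (rng.Next(1, 101) <= probabilityPercent)
        criticals++;
}
// Prints something close to 45.0 (e.g. 44.8 or 45.3), never a guaranteed exact figure.
Console.WriteLine((double)criticals / rolls * 100);

Any single window of 100 rolls will still drift a few points either way; only the long-run ratio settles near the configured percentage.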
Also; 45% chance of critical hit seems a bit high ;p
The thing is, with Random you might want to initialize this class only once. This is because Random uses the system time as a seed for generating random numbers. If your loop is very fast, it can happen that multiple Random instances use the same seed and thus generate the same numbers. Check the generated numbers if you suspect this is happening.
Besides this, it is inherent to randomness that it won't give you exact results. Even with a 50/50 chance, a sequence of 100 coin flips could come up "heads" all 100 times.
The only thing you can do is create the Random instance once and live with the results; otherwise you shouldn't use Random.
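As a minimal sketch of the "create it once" advice (the holder class name is just an assumption, not from the original answer):

using System;

static class GameRandom
{
    // A single Random created once at startup; reusing it avoids the
    // same-seed problem when many instances are created in a tight loop.
    public static readonly Random Instance = new Random();
}

// Usage in the hit check:
// if (GameRandom.Instance.Next(1, 101) <= ProbabilityPercent) DoSomething();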
No, that's not true. You have completely misunderstood the concept of probability. You don't want a "percentage-based probability", you want a "percentage-based random distribution of 100 samples".
What you need is a "bag of events": 45 of them "critical" and 55 of them "non-critical". Once you pick an event from the bag, only the remaining events are available for the next pick.
You can model it this way. Since I am no expert in C#, I will describe it as a C++-style function using rand() as the random number generator, but the idea is applicable in any language.
I have used this method to create very large percolated grids, and it works excellently when precise proportions are required.
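As a rough sketch of that bag idea, shown here in C# for consistency with the question (the original answer described it with C++'s rand(); the class and method names are illustrative): a list holds 45 critical and 55 non-critical outcomes, each pick removes an event, and the bag is refilled when empty.

using System;
using System.Collections.Generic;

class CriticalBag
{
    private readonly List<bool> bag = new List<bool>();
    private readonly Random rng = new Random();
    private readonly int percent;

    public CriticalBag(int percent)
    {
        this.percent = percent;
    }

    public bool NextHitIsCritical()
    {
        if (bag.Count == 0)
        {
            // Refill with exactly 'percent' criticals per 100 events,
            // e.g. 45 true and 55 false for a 45% chance.
            for (int i = 0; i < 100; i++)
                bag.Add(i < percent);
        }
        // Pick a random remaining event and remove it, so the bag is
        // drawn without replacement.
        int index = rng.Next(bag.Count);
        bool critical = bag[index];
        bag.RemoveAt(index);
        return critical;
    }
}

Every block of 100 draws then contains exactly the configured number of criticals, at the cost of the "lack of memory" property described above: after 55 non-critical draws, the remaining 45 are guaranteed critical.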