There's a similar question, I know, but it confused me, so I thought it easier to ask in my way.
So I have an array of values, positive and negative. The higher a value is, the greater its probability of being chosen.
I'm having trouble actually figuring out how to assign the probabilities and then randomly choose one. I'm guessing the array will need to be sorted first, but then I'm a bit lost after that.
"I have various different sizes of cups of coffee. The larger they are, the more I want to charge for them. I'm having trouble actually figuring out how to assign prices".
This isn't just a programming problem - you've specified that probability increases with value, but you haven't said how it increases with value. Normally, coffee shops don't charge in direct proportion to the amount of coffee. You can't assign probabilities in proportion to value, because some of your values are negative, but probabilities cannot be negative.
Sounds like you need to nail down the problem a bit more before you can write any code.
If you really don't care how probability relates to value, other than that they increase in order of value, then one easy way would be:
- sort your array
- assign a probability of 1 to the first element, 2 to the second, and so on.
- now, your probabilities don't add up to 1, which is a problem. So divide each probability by the total of all the probabilities you have assigned, which is 1 + 2 + ... + n = n(n+1)/2. This is called "normalization". (A code sketch of these steps follows this list.)
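Here's a minimal sketch of those steps in Python; the function name `rank_probabilities` is my own choice for illustration, not anything from the question:

```python
def rank_probabilities(values):
    """Sort the values and assign each one a probability
    proportional to its rank (1 for the smallest, n for the largest)."""
    ordered = sorted(values)
    n = len(ordered)
    total = n * (n + 1) / 2    # 1 + 2 + ... + n = n(n+1)/2
    return ordered, [(rank + 1) / total for rank in range(n)]
```

With four elements this yields probabilities 0.1, 0.2, 0.3, 0.4, which is exactly the example used below.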
Given your list of probabilities, which add up to 1, the easiest way to repeatedly choose one is generally to calculate the cumulative probabilities, which I will demonstrate with an example:
value (sorted):           -12   -3    127   1000000
assigned probability:     0.1   0.2   0.3   0.4
cumulative probability:   0.1   0.3   0.6   1.0
The cumulative probability is defined as the sum of all the probabilities up to that point.
Now, from your random number generator you need a random (floating-point) value between 0 and 1. If it lies between 0 and 0.1, you've picked -12. If it lies between 0.1 and 0.3, you've picked -3, and so on. To figure out which range it lies in, you could walk linearly through your array, or you could do a binary search.
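In code, that could look like the following sketch, using Python's `bisect` module for the binary search (the function and variable names are mine):

```python
import bisect
import random
from itertools import accumulate

def choose(ordered_values, probabilities):
    """Pick a value via cumulative probabilities and a binary search."""
    cumulative = list(accumulate(probabilities))  # e.g. [0.1, 0.3, 0.6, 1.0]
    r = random.random()                           # uniform in [0, 1)
    index = bisect.bisect_right(cumulative, r)    # first entry greater than r
    return ordered_values[index]
```

If you only choose once, the linear walk is simpler; the binary search pays off when you precompute the cumulative array once and draw many times.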
You could skip the normalization step and the use of floating-point, if you wanted. Assign "cumulative probabilities" (1, 3, 6, 10, ...), but make it understood that the actual probability is the stored integer value divided by n(n+1)/2. Then choose a random integer from 0 to n(n+1)/2 - 1. If it's less than 1, you've selected the first value, else if less than 3 the second, and so on. This may or may not make the code clearer, and your RNG may or may not do well choosing integer values from a large range.
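A sketch of that integer variant, assuming Python's `random.randrange` for the integer draw:

```python
import bisect
import random

def choose_integer(ordered_values):
    """Rank-weighted pick with integer cumulative weights; no floats."""
    n = len(ordered_values)
    cumulative = [k * (k + 1) // 2 for k in range(1, n + 1)]  # 1, 3, 6, 10, ...
    x = random.randrange(n * (n + 1) // 2)     # integer in [0, n(n+1)/2 - 1]
    return ordered_values[bisect.bisect_right(cumulative, x)]  # first weight > x
```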
Note that you could have assigned probabilities (0.001, 0.002, 0.003, 0.994) instead of (0.1, 0.2, 0.3, 0.4), and still satisfied your requirement that "the higher the value, the higher the probability".
One way could be:
- Make all values positive (add the absolute value of the minimum value to every value). Note that this gives the smallest value a weight of zero, so it can never be chosen; add a small extra constant as well if every value should remain selectable.
- Normalize the values to sum to 1 (divide each value by the sum of the values).
To draw a value from the resulting distribution, you can:
- Pick a random number in [0, 1].
- Sum the probabilities until the running sum is greater than or equal to the random number, then choose the value at that index. (See the sketch after this list.)
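A minimal sketch of both steps in Python, assuming the array holds at least one negative value and the shifted weights don't all come out zero (the function name `pick` is mine):

```python
import random

def pick(values):
    """Shift values to be non-negative, normalize, then sample one."""
    shift = abs(min(values))                 # assumes the minimum is negative
    weights = [v + shift for v in values]    # the minimum becomes weight 0
    total = sum(weights)
    probabilities = [w / total for w in weights]
    r = random.random()                      # uniform in [0, 1)
    running = 0.0
    for value, p in zip(values, probabilities):
        running += p
        if running >= r:
            return value
    return values[-1]                        # guard against rounding drift
```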
Following up on Steve Jessop's suggestion, after you've chosen a random integer x from 0 to n(n+1)/2 - 1, you can skip the search entirely and just take the floor of the triangular root: index = floor((-1 + sqrt(8x + 1)) / 2).
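In Python that might look like this; using `math.isqrt` keeps the square root exact for large n, which is my own precaution rather than part of the original suggestion:

```python
import math
import random

def choose_by_triangular_root(ordered_values):
    """O(1) rank-weighted pick: invert the cumulative sum analytically."""
    n = len(ordered_values)
    x = random.randrange(n * (n + 1) // 2)    # integer in [0, n(n+1)/2 - 1]
    index = (math.isqrt(8 * x + 1) - 1) // 2  # floor of the triangular root
    return ordered_values[index]
```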