FYI: random == pseudo-random
A. When generating uniformly-random numbers, I can specify a range, e.g.:
Math.random() * 20 - 5
//generates uniform numbers between -5 and 15
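For reference, a minimal sketch of what I mean by specifying a range (randomInRange and the min/max parameter names are just placeholders for this sketch):

//uniform value anywhere in [min, max)
function randomInRange(min, max) {
    return Math.random() * (max - min) + min;
}
randomInRange(-5, 15); //same -5 to 15 range as above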
B. Generating a set of random values with a rough, Gaussian-esque approximation of normal randomness:
//pass in the mean and standard deviation
function randomNorm(mean, stdev) {
    return Math.round(((Math.random() * 2 - 1) + (Math.random() * 2 - 1) + (Math.random() * 2 - 1)) * stdev + mean);
}
//using the following values:
{
    mean: 400,
    standard_deviation: 1
    //results in a range of 397-403, or a +- range of 3
},
{
    mean: 400,
    standard_deviation: 10
    //results in a range of 372-429, or a +- range of ~30
},
{
    mean: 400,
    standard_deviation: 25
    //results in a range of 326-471, or a +- range of ~75
}
Each one gives me a range of approximately +-3 * standard_deviation (and presumably exactly that if I left the program running longer).
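For reference, a quick sanity check along these lines reproduces those numbers (the 100,000 sample count and the variable names are just illustrative assumptions, not necessarily what I actually ran):

//sample randomNorm() many times and record the observed extremes
var lo = Infinity, hi = -Infinity;
for (var i = 0; i < 100000; i++) {
    var v = randomNorm(400, 25);
    if (v < lo) { lo = v; }
    if (v > hi) { hi = v; }
}
console.log(lo, hi); //creeps toward the theoretical 325-475 the longer it runs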
C. I can calculate this range as follows:
- assuming I want a range from 300-500, so var total_range = 200;
- my mean is 400, my +-range is total_range/2 (var r = 100)
- so standard_deviation would be r/3, or in this case 33.333 (sketched below).
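So the whole kludge, stitched together, would look something like this (randomInDefinedRange is just a name made up for this sketch):

//derive mean and standard deviation from a desired min/max range, then reuse randomNorm()
function randomInDefinedRange(min, max) {
    var mean = (min + max) / 2;      //400 for 300-500
    var r = (max - min) / 2;         //the +- range, 100 for 300-500
    return randomNorm(mean, r / 3);  //standard_deviation of 33.333
}
randomInDefinedRange(300, 500); //values clustered around 400, staying inside roughly 300-500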
This seems to be working, but I have no idea what I'm doing with the math, so I feel like an idiot; the whole solution feels kludgy and not totally accurate.
My question: is there some formula that I'm dancing around that can help me here? My requirements are as follows:
- must be able to define a range of numbers accurately.
- must be done in JavaScript, as efficiently as possible.
I think maybe I'm close but it's not quite there.