My Perlin noise function (which sums 6 octaves of 3D simplex noise at 0.75 persistence) generates a 2D array of doubles.
Each value comes out normalized to [-1, 1], with a mean of 0. I clamp the values to avoid exceptions, which I believe are caused by floating-point accuracy issues; in the ideal case, I am fairly confident my scaling factor restricts the output to exactly this range.
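For context, the generation loop is roughly the sketch below (a simplified reconstruction, not my exact code; `SimplexNoise.Sample3D` is a stand-in for the actual simplex implementation):

```csharp
// Simplified sketch of the octave summation; SimplexNoise.Sample3D is a
// placeholder for the real 3D simplex noise function.
double FractalNoise(double x, double y, double z)
{
    const int Octaves = 6;
    const double Persistence = 0.75;

    double sum = 0.0;
    double amplitude = 1.0;
    double frequency = 1.0;
    double totalAmplitude = 0.0; // scaling factor so the result stays in [-1, 1]

    for (int i = 0; i < Octaves; i++)
    {
        sum += amplitude * SimplexNoise.Sample3D(x * frequency, y * frequency, z * frequency);
        totalAmplitude += amplitude;
        amplitude *= Persistence;
        frequency *= 2.0;
    }

    // Rescale to [-1, 1] and clamp to guard against floating-point overshoot.
    double value = sum / totalAmplitude;
    return Math.Max(-1.0, Math.Min(1.0, value));
}
```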
Anyway, that's all details. The point is, here is a 256-by-256 array of noise:
The histogram with a normal fit looks like this:
MATLAB's lillietest applies the Lilliefors test to check whether a set of numbers comes from a normal distribution. My result was, repeatedly, 1, meaning the test rejects the hypothesis that these numbers are normally distributed.
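(As I understand it, the statistic the test uses is the largest gap between the sample's empirical CDF and a normal CDF fitted with the sample mean and standard deviation:

$$ D = \max_x \left| \hat{F}_n(x) - \Phi\!\left(\frac{x - \hat{\mu}}{\hat{\sigma}}\right) \right| $$

and a return value of 1 means rejection at the default 5% significance level.)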
I would like a function f(x) such that, when applied to the list of values from my noise function, the results appear uniformly distributed.
I would like this function to be implementable in C# and not take minutes to run.
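To make the request concrete, this is the kind of shape I imagine f(x) having; the normal-CDF idea (the probability integral transform) is only my guess at the right tool, and `ToUniform` / `Erf` are names I made up for illustration:

```csharp
// Guess at what f(x) might look like: standardize, then push through the
// normal CDF so that (if the input really were normal) the output would be
// uniform on [0, 1]. ToUniform and Erf are illustrative names, not existing APIs.
static double ToUniform(double x, double mean, double stdDev)
{
    double z = (x - mean) / stdDev;
    return 0.5 * (1.0 + Erf(z / Math.Sqrt(2.0)));
}

// .NET has no built-in error function, so an approximation would be needed;
// this is the Abramowitz & Stegun 7.1.26 formula (max error about 1.5e-7).
static double Erf(double x)
{
    double sign = x < 0 ? -1.0 : 1.0;
    x = Math.Abs(x);

    const double a1 = 0.254829592, a2 = -0.284496736, a3 = 1.421413741,
                 a4 = -1.453152027, a5 = 1.061405429, p = 0.3275911;

    double t = 1.0 / (1.0 + p * x);
    double y = 1.0 - ((((a5 * t + a4) * t + a3) * t + a2) * t + a1) * t * Math.Exp(-x * x);
    return sign * y;
}
```

Since lillietest says my values are not actually normal, something like this may not be quite right, which is why I'm asking what the standard approach is.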
Once again, it shouldn't matter where the numbers come from (the question is about transforming one distribution into another, specifically a normal-like one to uniform). Nevertheless, my noise function implementation is based on this and this. You can find the above array of values here.