First random number after setSeed in Java always similar

Posted 2019-02-05 11:07

To give some context: I have been writing a basic Perlin noise implementation in Java, and when it came to implementing seeding I encountered a bug that I couldn't explain.

In order to generate the same random weight vectors for a given seed every time, no matter which coordinates' noise level is queried and in what order, I generate a new seed (newSeed) from a combination of the original seed and the coordinates of the weight vector, and use it as the seed for the randomization of that weight vector by running:

rnd.setSeed(newSeed);
weight = new NVector(2);
weight.setElement(0, rnd.nextDouble() * 2 - 1); // each component in [-1, 1)
weight.setElement(1, rnd.nextDouble() * 2 - 1);
weight.normalize();

Where NVector is a self-made class for vector mathematics.
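The question doesn't show exactly how newSeed is built from the seed and the coordinates; as a purely hypothetical sketch of one such combination (the method name and constants are illustrative, not the author's), it might look something like this:

// Hypothetical sketch only: one possible way to fold the grid coordinates
// into the base seed. The original post does not show its exact combination.
private static long combineSeed(long baseSeed, int x, int y) {
    long h = baseSeed;
    h = h * 31 + x; // simple polynomial combination of the coordinates
    h = h * 31 + y;
    return h;
}

// usage (hypothetical): long newSeed = combineSeed(seed, gridX, gridY);

Any simple combination like this maps nearby coordinates (or nearby base seeds) to nearby values of newSeed, which turns out to matter below.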

However, when run, the program generated very bad noise, with pronounced vertical streaks (see the first image).

After some digging, I found that the value returned by the first nextDouble() call after each setSeed() call was very similar across seeds, and so the first element of every weight vector in the grid was nearly the same.

This can be demonstrated by running:

import java.util.Random;

public class Test {
    public static void main(String[] args) {
        long seed = Long.valueOf(args[0]);
        int loops = Integer.valueOf(args[1]);
        Random ran = new Random();
        double avgFirst = 0.0, avgSecond = 0.0, avgThird = 0.0;
        double lastFirst = 0.0, lastSecond = 0.0, lastThird = 0.0;
        for(int i = 0; i < loops; i++)
        {
            ran.setSeed(seed + i);
            double first = ran.nextDouble();
            double second = ran.nextDouble();
            double third = ran.nextDouble();
            avgFirst += Math.abs(first - lastFirst);
            avgSecond += Math.abs(second - lastSecond);
            avgThird += Math.abs(third - lastThird);
            lastFirst = first;
            lastSecond = second;
            lastThird = third;
        }
        System.out.println("Average first difference.: " + avgFirst/loops);
        System.out.println("Average second Difference: " + avgSecond/loops);
        System.out.println("Average third Difference.: " + avgThird/loops);
    }
}

This computes, over the range of seeds given by the program's arguments, the average difference between successive values of the first, second and third random numbers generated after each setSeed() call; for me it returned these results:

C:\> java Test 462454356345 10000
Average first difference.: 7.44638117976783E-4
Average second Difference: 0.34131692827329957
Average third Difference.: 0.34131692827329957

C:\> java Test 46245445 10000
Average first difference.: 0.0017196011123287126
Average second Difference: 0.3416750057190849
Average third Difference.: 0.3416750057190849

C:\> java Test 1 10000
Average first difference.: 0.0021601598225344998
Average second Difference: 0.3409914232342002
Average third Difference.: 0.3409914232342002

Here you can see that the average difference for the first value is significantly smaller than for the others, and it seems to shrink further as the seed gets larger.

As such, by adding a simple dummy call to nextDouble() before setting the weight vector, I was able to fix my Perlin noise implementation:

rnd.setSeed(newSeed);
rnd.nextDouble(); // dummy call: discard the poorly distributed first value
weight.setElement(0, rnd.nextDouble() * 2 - 1);
weight.setElement(1, rnd.nextDouble() * 2 - 1);

Resulting in properly varied noise, without the vertical streaks (second image).

I would like to know why this poor variation occurs in the first nextDouble() call (I have not checked other kinds of values), and/or to alert people to this issue.

Of course, it could just be an implementation error on my part, in which case I would be grateful if someone pointed it out.

3 Answers
闹够了就滚 · 2019-02-05 11:49

The Random class is designed to be a low overhead source of pseudo-random numbers. But the consequence of the "low overhead" implementation is that the number stream has properties that are a long way off perfect ... from a statistical perspective. You have encountered one of the imperfections. Random is documented as being a Linear Congruential generator, and the properties of such generators are well known.
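For concreteness, the effect can be reproduced from the algorithm the Random javadoc specifies: setSeed(seed) stores (seed ^ 0x5DEECE66DL) masked to 48 bits, each next(bits) call performs one multiply-and-add and returns the top bits, and nextDouble() is built from next(26) and next(27). A small self-contained sketch of that documented recurrence (not the JDK source itself):

// Sketch of the LCG step specified in java.util.Random's javadoc, used here
// only to show why close seeds give close *first* nextDouble() values.
public class LcgDemo {
    static final long MULT = 0x5DEECE66DL;
    static final long ADD  = 0xBL;
    static final long MASK = (1L << 48) - 1;

    static double firstDouble(long seed) {
        long s = (seed ^ MULT) & MASK;   // what setSeed(seed) stores
        s = (s * MULT + ADD) & MASK;     // first next(26)
        long hi = s >>> (48 - 26);
        s = (s * MULT + ADD) & MASK;     // second next(27)
        long lo = s >>> (48 - 27);
        return ((hi << 27) + lo) / (double) (1L << 53);
    }

    public static void main(String[] args) {
        // Adjacent seeds differ only in their low bits; one linear step does not
        // spread that difference into the high 26 bits that dominate nextDouble.
        for (long seed = 1; seed <= 5; seed++) {
            System.out.printf("seed %d -> first nextDouble %.6f%n", seed, firstDouble(seed));
        }
    }
}

By the second and third calls the seed difference has passed through the recurrence again and is spread across the whole 48-bit state, which is why only the first value clusters.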

There are a variety of ways of dealing with this. For example, if you are careful you can hide some of the most obvious "poor" characteristics. (But you would be advised to run some statistical tests. You can't see non-randomness in the noise added to your second image, but it could still be there.)

Alternatively, if you want pseudo-random numbers that have guaranteed good statistical properties, then you should be using SecureRandom instead of Random. It has significantly higher overheads, but you can be assured that many "smart people" will have spent a lot of time on the design, testing and analysis of the algorithms.
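At the API level it is close to a drop-in replacement, since SecureRandom extends java.util.Random; a minimal sketch of the swap:

import java.security.SecureRandom;

public class SecureRandomExample {
    public static void main(String[] args) {
        // SecureRandom extends java.util.Random, so nextDouble() etc. work unchanged.
        SecureRandom rnd = new SecureRandom();
        double x = rnd.nextDouble() * 2 - 1;
        double y = rnd.nextDouble() * 2 - 1;
        System.out.println(x + ", " + y);
    }
}

One caveat for this particular use case: SecureRandom.setSeed(long) supplements the internal state rather than replacing it, so it will not reproduce the same per-coordinate vectors the way the reseeded Random does.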

Finally, it is relatively simple to create a subclass of Random that uses an alternative algorithm for generating the numbers; see link. The problem is that you have to select (or design) and implement an appropriate algorithm.
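The hook for that is the protected next(int bits) method, which all of Random's public methods are built on. A rough sketch of the plumbing using an xorshift64*-style step (the shift and multiplier constants are the commonly published ones, but treat this as an illustration, not a vetted generator):

import java.util.Random;

// Sketch: plugging a different core generator into java.util.Random by
// overriding the protected next(int) hook.
public class XorShiftRandom extends Random {
    private long state;

    public XorShiftRandom(long seed) {
        super(seed); // Random's constructor ends up calling setSeed(seed)
    }

    @Override
    public synchronized void setSeed(long seed) {
        // Avoid an all-zero state, which would lock an xorshift generator at zero.
        this.state = (seed == 0) ? 0x9E3779B97F4A7C15L : seed;
    }

    @Override
    protected int next(int bits) {
        state ^= state >>> 12;
        state ^= state << 25;
        state ^= state >>> 27;
        long result = state * 0x2545F4914F6CDD1DL;
        return (int) (result >>> (64 - bits)); // return the requested top bits
    }
}

The hard part, as noted, is still choosing and validating an algorithm; the override mechanism itself is simple.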


Calling this an "issue" is debatable. It is a well known and understood property of LCGs, and the use of an LCG was a conscious engineering choice. People want low-overhead PRNGs, but low-overhead PRNGs have poor properties. TANSTAAFL.

Certainly, this is not something that Oracle would contemplate changing in Random. Indeed, the reasons for not changing are stated clearly in the javadoc for the Random class.

"In order to guarantee this property, particular algorithms are specified for the class Random. Java implementations must use all the algorithms shown here for the class Random, for the sake of absolute portability of Java code."

We Are One · 2019-02-05 11:53

Move your setSeed out of the loop. Java's PRNG is a linear congruential generator, so seeding it with sequential values is guaranteed to give results that are correlated across iterations of the loop.
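In java.util.Random terms, that advice amounts to seeding once and then drawing every value from the same stream, e.g. (minimal sketch, not the questioner's per-coordinate use case):

import java.util.Random;

public class SeedOnce {
    public static void main(String[] args) {
        Random rnd = new Random(462454356345L); // seed once, outside the loop
        for (int i = 0; i < 10; i++) {
            // consecutive draws from a single stream do not show the correlation
            // that first-draws-after-sequential-reseeding do
            System.out.println(rnd.nextDouble());
        }
    }
}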

ADDENDUM

I dashed that off before running out the door to a meeting, and now have time to illustrate what I was saying above.

I've written a little Ruby script which implements Schrage's portable prime modulus multiplicative linear congruential generator. I instantiate two copies of the LCG, both seeded with a value of 1. However, in each iteration of the output loop I reseed the second one based on the loop index. Here's the code:

# Implementation of a Linear Congruential Generator (LCG)
class LCG
  attr_reader :state
  M = (1 << 31) - 1    # Modulus = 2**31 - 1, which is prime

  # constructor requires setting a seed value to use as initial state
  def initialize(seed)
    reseed(seed)
  end

  # users can explicitly reset the seed.
  def reseed(seed)
    @state = seed.to_i
  end

  # Schrage's portable prime modulus multiplicative LCG
  def value
    @state = 16807 * @state % M
    # return the generated integer value AND its U(0,1) mapping as an array
    [@state, @state.to_f / M]
  end
end

if __FILE__ == $0
  # create two instances of LCG, both initially seeded with 1
  mylcg1 = LCG.new(1)
  mylcg2 = LCG.new(1)
  puts "   default progression     manual reseeding"
  10.times do |n|
    mylcg2.reseed(1 + n)  # explicitly reseed 2nd LCG based on loop index
    printf "%d %11d %f %11d %f\n", n, *mylcg1.value, *mylcg2.value
  end
end

and here's the output it produces:

   default progression     manual reseeding
0       16807 0.000008       16807 0.000008
1   282475249 0.131538       33614 0.000016
2  1622650073 0.755605       50421 0.000023
3   984943658 0.458650       67228 0.000031
4  1144108930 0.532767       84035 0.000039
5   470211272 0.218959      100842 0.000047
6   101027544 0.047045      117649 0.000055
7  1457850878 0.678865      134456 0.000063
8  1458777923 0.679296      151263 0.000070
9  2007237709 0.934693      168070 0.000078

The columns are iteration number followed by the underlying integer generated by the LCG and the result when scaled to the range (0,1). The left set of columns show the natural progression of the LCG when allowed to proceed on its own, while the right set show what happens when you reseed on each iteration.

够拽才男人 · 2019-02-05 12:10

This is a known issue: similar seeds generate similar values for the first few calls. Random wasn't really designed to be used this way. You are supposed to create an instance with a good seed and then generate a moderately sized sequence of "random" numbers.

Your current solution is OK, as long as it looks good and is fast enough. You can also consider using hashing/mixing functions that were designed for exactly this problem (and then, optionally, using their output as the seed). For example, see: Parametric Random Function For 2D Noise Generation.
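As a hedged sketch of that idea in Java: run the combined (seed, coordinates) value through a SplitMix64-style finalizer before it ever reaches setSeed, so that nearby inputs map to thoroughly different seeds. The mixing constants below are the published SplitMix64 ones; the surrounding wiring is illustrative.

import java.util.Random;

public class MixedSeedExample {
    // SplitMix64 finalizer: spreads a change in any input bit across the output.
    static long mix64(long z) {
        z = (z ^ (z >>> 30)) * 0xBF58476D1CE4E5B9L;
        z = (z ^ (z >>> 27)) * 0x94D049BB133111EBL;
        return z ^ (z >>> 31);
    }

    public static void main(String[] args) {
        Random rnd = new Random();
        long seed = 462454356345L;
        for (int x = 0; x < 3; x++) {
            long combined = seed * 31 + x;        // nearby inputs...
            rnd.setSeed(mix64(combined));         // ...become far-apart seeds
            System.out.println(rnd.nextDouble()); // first value no longer clusters
        }
    }
}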
