I have a simple method I've written for a class homework assignment that uses recursion (yes, it must use recursion) to calculate the number of triangles in a fractal pattern:
public static BigInteger triangleFract(int layer) {
    if (layer < 0) {
        throw new IllegalArgumentException("Input must be >= 0");
    } else if (layer == 0) {
        return new BigInteger("0");
    } else if (layer == 1) {
        return new BigInteger("1");
    } else {
        // each new layer triples the previous count and adds 2
        return triangleFract(layer - 1)
                .multiply(new BigInteger("3"))
                .add(new BigInteger("2"));
    }
}
What I've been trying to do is understand how large layer can be, so that I can limit user input. After some tests I get a stack overflow at around 6700+, which is fine.
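Roughly the kind of test I used (a sketch, not my exact harness):

public static void main(String[] args) {
    int layer = 1;
    try {
        while (true) {
            triangleFract(layer);
            layer += 100;   // step in chunks; the exact step doesn't matter
        }
    } catch (StackOverflowError e) {
        System.out.println("Overflowed around layer " + layer);
    }
}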
What is troubling me is that if layer is in the thousands, the method usually runs, but it can still randomly encounter a StackOverflowError.
For instance, I chose to limit layer to 4444, and it seems to be able to handle that almost always, but every once in a while it still seems to overflow.
Why does it do this? And is there anything that I can do about it?
I couldn't reproduce your 'fluctuation' effect. This is pretty deterministic code, so you should get the same result every time (including the stack overflow error).
How did you test it? Did you run a new JVM for each try of your 4444 test, or was it just triangleFract(4444) called in a loop?
What's your OS, Java version, etc.?
I'm asking because I don't really like unsolved problems like this; something like this could bite you where (and when) it hurts ;).
Oh, and BTW, for what it's worth: you should be using the ONE and ZERO constants from BigInteger (and create your own constants for 2 and 3, for that matter). This should save you quite a bit of memory (yes, I know, this wasn't your question).
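A sketch of what that would look like on your recursive method (the constant names TWO and THREE are mine):

private static final BigInteger TWO = new BigInteger("2");
private static final BigInteger THREE = new BigInteger("3");

public static BigInteger triangleFract(int layer) {
    if (layer < 0) {
        throw new IllegalArgumentException("Input must be >= 0");
    } else if (layer == 0) {
        return BigInteger.ZERO;
    } else if (layer == 1) {
        return BigInteger.ONE;
    } else {
        // same recurrence, reusing shared constants instead of allocating new objects on every call
        return triangleFract(layer - 1).multiply(THREE).add(TWO);
    }
}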
Allowing recursion to that depth is a design smell.
Try this iterative version:
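Something along these lines (a sketch that keeps the same base cases and recurrence as your recursive method):

private static final BigInteger TWO = new BigInteger("2");
private static final BigInteger THREE = new BigInteger("3");

public static BigInteger triangleFract(int layer) {
    if (layer < 0) {
        throw new IllegalArgumentException("Input must be >= 0");
    }
    if (layer == 0) {
        return BigInteger.ZERO;
    }
    // layer 1 contributes a single triangle; every further layer applies
    // count = 3 * count + 2, exactly like the recursive version.
    BigInteger count = BigInteger.ONE;
    for (int i = 2; i <= layer; i++) {
        count = count.multiply(THREE).add(TWO);
    }
    return count;
}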
Notes:
- BigInteger.ZERO and BigInteger.ONE are used instead of creating new instances for these values
- there is no else after a terminating statement (e.g. return)
- constants are used for new BigInteger("2") and new BigInteger("3") rather than creating new instances on every iteration

Consider moving to an iterative version. The way I see it, if you develop a recursive algorithm you must control the recursion depth, or not use recursion at all; for example, cap the requested depth up front (see the sketch below).
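One way to enforce such a cap on the recursive version (MAX_LAYER here is just an assumed safe bound; tune it to your stack size):

private static final int MAX_LAYER = 4000;   // assumed safe bound for the default stack size

public static BigInteger triangleFractChecked(int layer) {
    if (layer > MAX_LAYER) {
        throw new IllegalArgumentException("layer " + layer + " exceeds the supported depth of " + MAX_LAYER);
    }
    return triangleFract(layer);
}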
For those who can't reproduce this fluctuation: find the layer value starting from which the method will reliably throw StackOverflowError. The closer this value is to the real threshold, the better. Now call the method from inside a loop (on my machine maxLayer = 11500):
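Something like this (a sketch; the number of iterations is not important):

// maxLayer is the smallest value that reliably overflows on my machine
int maxLayer = 11500;
for (int i = 0; i < 100; i++) {
    triangleFract(maxLayer);
}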
It will throw StackOverflowError. Now decrease this value a little bit (it looks like 5-10% should be enough):
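Again a sketch (warmUpLayer is my name for the reduced value):

int maxLayer = 11500;
int warmUpLayer = 11000;                  // roughly 5% below the overflow threshold
for (int i = 0; i < 100; i++) {
    triangleFract(warmUpLayer);           // let the JIT warm the method up
}
triangleFract(maxLayer);                  // the value that used to overflow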
Well, on my machine this doesn't throw any error and successfully jumps over 11500. Actually, it works fine up until 16000.
So whatever it is, it probably involves JVM optimisations. I ran the program with -XX:+PrintCompilation and could see the JIT doing its job while the loop was running. Could it be compilation? Let's try to delay compilation so that it affects us as late as possible. I played with the -XX:CompileThreshold flag and pretty soon found a value (-XX:CompileThreshold=1000000) which doesn't let my loop jump over 11500.
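For reference, the flag goes on the java command line (the class name here is just a placeholder for your test program):

java -XX:CompileThreshold=1000000 TriangleFractTest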
UPDATE
I finally reproduced it without any tweaking of the compilation threshold. For me it looks like it only happens when I run the program from my IDE (IntelliJ IDEA), so it may have something to do with IDEA's launcher. I copied its command line and used it in a small script that runs the test 100 times and counts the runs that finish without a StackOverflowError.
What I found out is that it usually prints something less than 100 (95-98), which is consistent with what I saw when I did it manually. When I skip the launcher and run the same command directly, it always prints 100.
Perhaps the JVM has determined (through escape analysis) that the BigInteger can be allocated on the stack rather than the heap. Depending on when it implements this optimization, the required stack size would vary.
That said, there could be many other causes, and the behaviour is likely to depend on the JVM you use.
There is actually something you can do: increase the maximum stack size. This is done at JVM startup with the -Xss option, like this:
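For example (the 8 MB value and the class name are just placeholders to adapt to your program):

java -Xss8m TriangleFractTest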
Be careful not to set an excessively high value. If you have to go over 60M - 70M, then a redesign of your code is recommended.