I ran into what seemed like a mysterious bug this morning, and I feel very lucky to have stumbled on a solution quickly.
I was dividing by a counter to produce an average inside a fragment shader, and of course when the counter is zero the division is 0/0, so the resulting color value became NaN.
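In hindsight, a simple guard would have avoided the 0/0 entirely. A sketch of what I mean, where `sum` and `count` are placeholder names for the shader's accumulator and counter:

```glsl
// Hypothetical names: `sum` holds the accumulated color, `count` the number
// of samples. Guard the division so count == 0 yields a defined color
// instead of NaN.
vec4 averaged = (count > 0.0)
    ? sum / count   // normal case: the true average
    : vec4(0.0);    // zero samples: fall back to black explicitly

// Alternatively, max(count, 1.0) works when sum is guaranteed to be zero
// whenever count is zero, since 0.0 / 1.0 == 0.0.
```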
During blending, NVIDIA gracefully treats a NaN as a 0 value, but Intel does not: it propagates the NaN, resulting in black fragments.
And so this bug persisted until I tested the code on an Intel machine.
I wonder if there is something I could do to "trap" invalid values. It seems that, just as in regular programming, the only surefire way to deal with this (and even then it doesn't feel bulletproof) is to carefully consider every case where a division could produce an invalid result.
The standard way to detect a NaN is to check whether a number is not equal to itself. Could I build a debug shader that tests each fragment for self-inequality and, if the test fires, outputs a flashing, conspicuous color? Does GLSL allow me to detect a NaN this way, or am I stuck with undefined behavior whenever a value is invalid?
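Something like this is what I have in mind. It's only a sketch; `u_time` and `v_color` are placeholder names, and I'm aware GLSL 1.30+ also provides a built-in `isnan()`, though I don't know whether either approach survives aggressive compiler optimization:

```glsl
#version 130

uniform float u_time;   // assumed uniform driving the flash
in vec4 v_color;        // assumed input: the value to validate
out vec4 fragColor;

// NaN is the only value not equal to itself, so self-inequality
// on any component flags it. isnan(v) would be the built-in route.
bool anyNaN(vec4 v) {
    return any(notEqual(v, v));
}

void main() {
    if (anyNaN(v_color)) {
        // Flash between magenta and yellow so bad fragments stand out.
        fragColor = mod(u_time, 0.5) < 0.25
            ? vec4(1.0, 0.0, 1.0, 1.0)
            : vec4(1.0, 1.0, 0.0, 1.0);
    } else {
        fragColor = v_color;
    }
}
```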