Using the OpenGL 3.3 core profile, I'm rendering a full-screen "quad" (as a single oversized triangle) via gl.DrawArrays(gl.TRIANGLES, 0, 3) with the following shaders.
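For context, the per-frame draw on the host side boils down to this -- a sketch assuming the go-gl v3.3-core bindings, with prog and emptyVAO standing in for my real program and VAO handles:

package fsquad // illustrative package name, not from my actual project

import "github.com/go-gl/gl/v3.3-core/gl" // assumed binding; my real host code may differ

// drawFullScreenTriangle issues the attribute-less draw: no VBO is bound and
// the vertex shader derives positions from gl_VertexID, but the core profile
// still requires some VAO (even an empty one) to be bound.
func drawFullScreenTriangle(prog, emptyVAO uint32) {
    gl.Clear(gl.COLOR_BUFFER_BIT)
    gl.UseProgram(prog)
    gl.BindVertexArray(emptyVAO)
    gl.DrawArrays(gl.TRIANGLES, 0, 3)
}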
Vertex shader:
#version 330 core
#line 1
vec4 vx_Quad_gl_Position () {
    const float extent = 3;
    const vec2 pos[3] = vec2[](vec2(-1, -1), vec2(extent, -1), vec2(-1, extent));
    return vec4(pos[gl_VertexID], 0, 1);
}
void main () {
    gl_Position = vx_Quad_gl_Position();
}
Fragment shader:
#version 330 core
#line 1
out vec3 out_Color;
vec3 fx_RedTest (const in vec3 vCol) {
    return vec3(0.9, 0.1, 0.1);
}
vec3 fx_Grayscale (const in vec3 vCol) {
    return vec3((vCol.r * 0.3) + (vCol.g * 0.59) + (vCol.b * 0.11));
}
void main () {
    out_Color = fx_RedTest(out_Color);
    out_Color = fx_Grayscale(out_Color);
}
Now, the code may look a bit odd and its present purpose may seem useless, but that shouldn't faze the GL driver.
On a GeForce, I get a gray screen as expected: the "grayscale effect" applied to the hard-coded color "red" (0.9, 0.1, 0.1).
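(With those luma weights, the expected value works out to 0.9 * 0.3 + 0.1 * 0.59 + 0.1 * 0.11 = 0.27 + 0.059 + 0.011 = 0.34, i.e. a uniform dark-ish gray across the whole screen.)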
However, the Intel HD 4000 (driver version 9.17.10.2932 from 12-12-2012 -- the newest as of today) consistently shows nothing but the following constantly flickering noise pattern:
Now, just to experiment a little, I changed the fx_Grayscale() function a bit -- it should effectively yield the same visual result, just coded slightly differently:
vec3 fx_Grayscale (const in vec3 vCol) {
    vec3 col = vec3(0.9, 0.1, 0.1);
    col = vCol;
    float x = (col.r * 0.3) + (col.g * 0.59) + (col.b * 0.11);
    return vec3(x, x, x);
}
Again, Nvidia does the correct thing, whereas the Intel HD now consistently produces a rather different, but still constantly flickering, noise pattern:
Must I suspect (yet another) Intel GL driver bug, or do you see any issues with my GLSL code -- not from a prettiness perspective (it's part of a shader code-gen experimental project) but from a mere spec-correctness point of view?