My app draws a mixture of point sprites and standard textured quads. Until recently I was using a single shader program for both; however, a branch in the fragment shader (to decide whether to use the point sprite coords/colors) was seriously throttling my rendering performance. My fragment shader originally looked like this:
precision highp float;

uniform sampler2D u_textureSprite;
uniform bool u_isSprite;

varying vec4 v_color;
varying vec2 v_textureCoord;

void main()
{
    vec4 textureColor = texture2D(u_textureSprite, u_isSprite ? gl_PointCoord : v_textureCoord);
    if (u_isSprite) {
        gl_FragColor = v_color * textureColor;
    }
    else {
        gl_FragColor = textureColor;
    }
}
Reading Apple's recommendations in the OpenGL ES Programming Guide makes it sound like using multiple shader programs is an easy job: just create another program as normal and call glUseProgram() before the relevant draw code. However, since doing this I cannot get the texture rendering to work. The two new fragment shaders are:
pointSprite.fsh:
precision highp float;

uniform sampler2D s_textureSprite;

varying vec4 v_color;

void main()
{
    vec4 textureColor = texture2D(s_textureSprite, gl_PointCoord);
    gl_FragColor = v_color * textureColor;
}
texture.fsh:
precision highp float;

uniform sampler2D s_texture;

varying vec2 v_textureCoord;

void main()
{
    gl_FragColor = texture2D(s_texture, v_textureCoord);
}
Pretty trivial, I think. If I ignore all calls to draw the texture, I can see that the point sprites are being rendered just fine. Texture quads, however, render as a solid grey color. Incidentally, this grey does not match any of my glClearColor() colors; I believe it is a color sampled from some point of the texture it is trying to render.
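For reference, my setup for the two programs is essentially the following (a simplified sketch; the shader handles, attribute names, and the PROGRAM_POINT_SPRITE constant are illustrative stand-ins rather than my exact code):

// Build the two programs (shader compilation omitted; identifiers illustrative)
programs[PROGRAM_POINT_SPRITE] = glCreateProgram();
glAttachShader(programs[PROGRAM_POINT_SPRITE], pointSpriteVertShader);
glAttachShader(programs[PROGRAM_POINT_SPRITE], pointSpriteFragShader);
glBindAttribLocation(programs[PROGRAM_POINT_SPRITE], ATTRIB_VERTEX, "a_position");
glLinkProgram(programs[PROGRAM_POINT_SPRITE]);

programs[PROGRAM_TEXTURE] = glCreateProgram();
glAttachShader(programs[PROGRAM_TEXTURE], textureVertShader);
glAttachShader(programs[PROGRAM_TEXTURE], textureFragShader);
glBindAttribLocation(programs[PROGRAM_TEXTURE], ATTRIB_VERTEX, "a_position");
glBindAttribLocation(programs[PROGRAM_TEXTURE], ATTRIB_TEXTURE_COORD, "a_textureCoord");
glLinkProgram(programs[PROGRAM_TEXTURE]);

// Point each program's sampler uniform at texture unit 0 once, after linking
glUseProgram(programs[PROGRAM_POINT_SPRITE]);
glUniform1i(glGetUniformLocation(programs[PROGRAM_POINT_SPRITE], "s_textureSprite"), 0);
glUseProgram(programs[PROGRAM_TEXTURE]);
glUniform1i(glGetUniformLocation(programs[PROGRAM_TEXTURE], "s_texture"), 0);

After that, I call glUseProgram() with the appropriate program before each batch of point sprite or quad draws, as in the texture draw method below.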
A little sample code for a texture render is as follows:
- (void)drawTexture:(GLuint)texture {
    glUseProgram(programs[PROGRAM_TEXTURE]);

    glBindTexture(GL_TEXTURE_2D, texture);

    // Interleaved vertex data: 2 float position components + 2 normalized unsigned byte texture coords
    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObjects[VBO_TEXTURE_VERTICES]);
    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, GL_FALSE, sizeof(ISTextureData), (void*)offsetof(ISTextureData, vertex));
    glEnableVertexAttribArray(ATTRIB_VERTEX);
    glVertexAttribPointer(ATTRIB_TEXTURE_COORD, 2, GL_UNSIGNED_BYTE, GL_TRUE, sizeof(ISTextureData), (void*)offsetof(ISTextureData, textureCoord));
    glEnableVertexAttribArray(ATTRIB_TEXTURE_COORD);

    // Draw the quad as a 4-index triangle strip
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vertexBufferObjects[VBO_TEXTURE_INDICIES]);
    glDrawElements(GL_TRIANGLE_STRIP, 4, GL_UNSIGNED_SHORT, (void*)0);

    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
    glBindTexture(GL_TEXTURE_2D, 0);
}
This code works when not dealing with multiple shaders. Or am I being careless, and the texture fragment shader code is wrong?
Thanks
EDIT:
Well, it's been a number of long and painful hours, and what's really irritating is that it turns out not to be a code issue at all, but something to do with the name of the project directory.
My Xcode project directory was named "ProjectName (current)". This is the project I was having trouble with. I have since discovered that changing this folder name to anything else makes the app work and render the background texture properly. Just to clarify: the app previously worked normally except for the rendering of the background texture in OpenGL, and after renaming the project folder everything works fine.
Any ideas why this might be?