Weird OpenGL issue when factoring out code

Posted 2019-03-03 03:55

Question:

So I have some code that creates a buffer and puts some vertices in it:

GLuint vertexbuffer;
glGenBuffers(1, &vertexbuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);
glBindBuffer(GL_ARRAY_BUFFER, 0);
glEnableVertexAttribArray(0);

I also bind it to a shader attribute:

glBindAttribLocation(programID, 0, "pos");

And, finally, draw it:

glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glDrawArrays(GL_TRIANGLES, 0, 3);

Of course, there is other code, but all of this runs fine (it displays a red triangle on the screen).

However, the instant I try to factor this code out into a struct, nothing displays. Here is one of the methods:

void loadVerts(GLfloat verts[], int indices)
{
    GLuint vertexbuffer;
    glGenBuffers(1, &vertexbuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    glVertexAttribPointer(indice, indices, GL_FLOAT, GL_FALSE, 0, 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glEnableVertexAttribArray(indice);
    indice++;
    buffers.push_back(vertexbuffer);
}

I've quadruple-checked this code, and I've also traced it to make sure it matches the code above whenever it's called. My draw call is almost the same as the original:

void draw()
{
    glBindBuffer(GL_ARRAY_BUFFER, buffers.at(0));
    glDrawArrays(GL_TRIANGLES, 0, 3);
}

I've also tried making this a class, and adding/changing many parts of the code. buffers and indice are just variables to keep track of buffers and attribute indices; buffers is a std::vector<GLuint>, FWIW.

Answer 1:

The main problem is here:

void loadVerts(GLfloat verts[], int indices)
{
    ...
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

The verts argument is declared with array syntax, but in C++ an array parameter decays to a pointer, so its type is really a pointer to GLfloat. Your function signature is equivalent to:

void loadVerts(GLfloat* verts, int indices)

So sizeof(verts), which is used as the second argument to glBufferData(), is just the size of a pointer: 4 on a 32-bit architecture, 8 on a 64-bit architecture. The buffer therefore receives only the first few bytes of your vertex data.
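To see the decay in action, here is a minimal standalone sketch (using plain float, which GLfloat is defined to be, so no GL headers are needed; the function name is just for illustration):

#include <cstdio>

// The array syntax is misleading: this parameter is really a float*.
void printSize(float verts[])
{
    printf("inside the function: %zu\n", sizeof(verts)); // 4 or 8 (pointer size)
}

int main()
{
    float data[9] = {};
    printf("at the call site: %zu\n", sizeof(data)); // 36 (9 * sizeof(float))
    printSize(data);
}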

You will need to pass the size as an additional argument to this function, and use that value as the second argument to glBufferData().
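A minimal sketch of the fix, assuming buffers and indice are the member variables described in the question:

void loadVerts(const GLfloat* verts, GLsizeiptr size, int components)
{
    GLuint vertexbuffer;
    glGenBuffers(1, &vertexbuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
    // 'size' is now the actual byte count supplied by the caller.
    glBufferData(GL_ARRAY_BUFFER, size, verts, GL_STATIC_DRAW);
    glVertexAttribPointer(indice, components, GL_FLOAT, GL_FALSE, 0, 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glEnableVertexAttribArray(indice);
    indice++;
    buffers.push_back(vertexbuffer);
}

At the call site the array has not decayed yet, so sizeof still gives the full byte count:

loadVerts(g_vertex_buffer_data, sizeof(g_vertex_buffer_data), 3);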

These statements also look somewhat confusing:

glVertexAttribPointer(indice, indices, GL_FLOAT, GL_FALSE, 0, 0);
glEnableVertexAttribArray(indice);

I can't tell if there's a real problem, but you have two variables with very similar names that are used very differently. indice needs to be the location of the attribute in your vertex shader, while indices needs to be the number of components in the attribute.
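If both values are in fact correct and only the names are confusing, renaming them (hypothetical names below) makes the two roles explicit:

// attribLocation: the attribute index bound with glBindAttribLocation()
// componentCount: values per vertex (3 for a vec3 position)
glVertexAttribPointer(attribLocation, componentCount, GL_FLOAT, GL_FALSE, 0, 0);
glEnableVertexAttribArray(attribLocation);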