I'm trying to retrieve the pixel data of an alpha-only texture via glGetTexImage.
The problem is that the glGetTexImage call seems to write more data into my buffer than it should, corrupting the heap and crashing at the delete[] call. Here's my code:
GLint format;
glGetTexLevelParameteriv(target, 0, GL_TEXTURE_INTERNAL_FORMAT, &format);
GLint w;
GLint h;
glGetTexLevelParameteriv(target, 0, GL_TEXTURE_WIDTH, &w);
glGetTexLevelParameteriv(target, 0, GL_TEXTURE_HEIGHT, &h);
if(w == 0 || h == 0)
    return false;
if(format != GL_ALPHA)
    return false;
// one byte per pixel for an alpha-only texture
unsigned int size = w * h * sizeof(unsigned char);
unsigned char *pixels = new unsigned char[size];
glGetTexImage(target, level, format, GL_UNSIGNED_BYTE, &pixels[0]);
delete[] pixels;
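To narrow down how far past the end the call writes, I can over-allocate, fill the buffer with a sentinel and check how many trailing bytes are still untouched afterwards. This is only a diagnostic sketch (reusing target, w and h from above; the 0xAB sentinel and the factor of 4 are arbitrary choices of mine, not from my real code):

// needs <cstring> for memset
const unsigned int expected = w * h;        // 19 * 24 = 456 bytes
const unsigned int padded = expected * 4;   // generous safety margin
unsigned char *probe = new unsigned char[padded];
memset(probe, 0xAB, padded);                // sentinel pattern in every byte
glGetTexImage(target, 0, GL_ALPHA, GL_UNSIGNED_BYTE, probe);
unsigned int written = padded;
while(written > 0 && probe[written - 1] == 0xAB)
    --written;                              // index past the last byte GL changed
delete[] probe;

(This can undercount if a legitimately written byte happens to equal the sentinel, but it should show roughly how many bytes the call actually writes.)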
glGetError reports no errors, and without the glGetTexImage call it doesn't crash.
'target' is GL_TEXTURE_2D (the texture is valid and bound before the shown code), 'w' is 19, 'h' is 24 and 'level' is 0, so the buffer is 19 * 24 = 456 bytes.
If I increase the array size to (w * h * 100), it doesn't crash either. I know for a fact that GLubyte, the type GL_UNSIGNED_BYTE refers to, has the same size as an unsigned char on my system, so I don't understand what's going on here.
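(For completeness, the size check I'm relying on is essentially this, assuming a C++11 compiler:)

static_assert(sizeof(GLubyte) == sizeof(unsigned char), "GLubyte / unsigned char size mismatch");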
Where's the additional data coming from, and how can I make sure that my array is large enough?