Look at the following OpenGL function:
void glTexImage2D(GLenum target,
                  GLint level,
                  GLint internalFormat,
                  GLsizei width,
                  GLsizei height,
                  GLint border,
                  GLenum format,
                  GLenum type,
                  const GLvoid * data);
I know that the parameters format and type describe the format and data type of the image data I pass in, but I don't understand the parameter internalFormat. How should I set its value in my application?
For example, I create a texture like this:
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE8, size, size, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, buffer);
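To make the question concrete, these are the kinds of alternatives I'm unsure about. This is only a sketch of what I mean, and I don't know whether either combination is actually the right choice here:

// Alternative A: my guess at a sized single-channel format instead of GL_LUMINANCE8.
glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, size, size, 0, GL_RED, GL_UNSIGNED_BYTE, buffer);

// Alternative B: a floating-point internal format, still uploading the same byte data.
glTexImage2D(GL_TEXTURE_2D, 0, GL_R32F, size, size, 0, GL_RED, GL_UNSIGNED_BYTE, buffer);

Everything below still uses the GL_LUMINANCE8 version.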
When I access the texture in my GLSL shader, the value I get seems to be in [0,1]. Why? Shouldn't it be in [0,255]?
Part of my shader code is:
vec = EntryPoint + delta_dir * texture(noiseTex, EntryPoint.xy * 32).x;
Part of my C++ code is:
// Fill the buffer with random noise values in [0, 255].
for (int i = 0; i < temp; ++i)
{
    buffer[i] = 255.0 * rand() / (float)RAND_MAX;
}

// Upload the noise as an 8-bit luminance texture.
glGenTextures(1, &noiseTex);
glActiveTexture(GL_TEXTURE0 + activeTexUnit);
glBindTexture(GL_TEXTURE_2D, noiseTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE8, size, size,
             0, GL_LUMINANCE, GL_UNSIGNED_BYTE, buffer);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
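For completeness, this is roughly how I hook the texture up to the sampler in the shader; the name program stands in for my linked shader program object, since I've trimmed that part of the code:

// Sketch: point the "noiseTex" sampler at the texture unit used above.
glUseProgram(program);
GLint noiseLoc = glGetUniformLocation(program, "noiseTex");
glUniform1i(noiseLoc, activeTexUnit);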