OpenGL ES 2.0 textures for retina display?

Posted 2019-08-13 18:47

Question:

I have a GLKView in which I try to draw a couple of cubes; I create textures from a view and map them onto the cubes. However, when I start the app on a retina device, the textures are correctly sized but they look terrible. I have tried setting the contentScaleFactor of the GLKView to the scale of the main screen, to no avail. I have also tried multiplying the buffer's dimensions by the scale, which resulted in textures that looked crisp, but were only 1/4 of the original size...

Without further ado, here is what I have done (without the multiplication mentioned above):

GLKView

- (void)setupGL {

    UIScreen *mainScreen = [UIScreen mainScreen];
    const CGFloat scale = mainScreen.scale;
    self.contentScaleFactor = scale;
    self.layer.contentsScale = scale;

    glGenFramebuffers(1, &defaultFrameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, defaultFrameBuffer);

    glGenRenderbuffers(1, &depthBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, depthBuffer);

    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, self.bounds.size.width, self.bounds.size.height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthBuffer);

    glGenRenderbuffers(1, &colorBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, colorBuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA4, self.bounds.size.width, self.bounds.size.height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorBuffer);

    glEnable(GL_DEPTH_TEST);

}
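When I experimented with multiplying by the scale, the renderbuffer allocations looked roughly like this (the rest of setupGL was unchanged):

    // scaled experiment: allocate the renderbuffers in pixels instead of points
    const GLsizei widthPx  = (GLsizei)(self.bounds.size.width  * scale);
    const GLsizei heightPx = (GLsizei)(self.bounds.size.height * scale);

    glBindRenderbuffer(GL_RENDERBUFFER, depthBuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, widthPx, heightPx);

    glBindRenderbuffer(GL_RENDERBUFFER, colorBuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA4, widthPx, heightPx);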

Here I load the textures

// make space for an RGBA image of the view
GLubyte *pixelBuffer = (GLubyte *)malloc(4 *
                                         cV.bounds.size.width *
                                         cV.bounds.size.height);

// create a suitable CoreGraphics context
CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context =
CGBitmapContextCreate(pixelBuffer,
                      cV.bounds.size.width, cV.bounds.size.height,
                      8, 4*cV.bounds.size.width,
                      colourSpace,
                      kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colourSpace);

// draw the view to the buffer
[cV.layer renderInContext:context];

// upload to OpenGL
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGBA,
             cV.bounds.size.width, cV.bounds.size.height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixelBuffer);

// the bitmap context and the CPU-side copy are no longer needed after the upload
CGContextRelease(context);
free(pixelBuffer);

glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
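(For completeness: the texture that glTexImage2D and glTexParameterf operate on is generated and bound before this snippet, along these lines:)

GLuint textureName;
glGenTextures(1, &textureName);
glBindTexture(GL_TEXTURE_2D, textureName);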

Answer 1:

The answer to this question can be found here:

How to create a CGBitmapContext which works for Retina display and not wasting space for regular display?

What I basically did was multiply the texture and the buffer dimensions by the screen's scale factor. Because this alone only yielded a texture that was 1/4 of the size, I also had to scale the context by the scale factor:

CGContextScaleCTM(context, scaleFactor, scaleFactor);
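
Putting the two changes together, the texture creation from above ends up roughly like this (same cV view and pixel buffer as in the question):

UIScreen *mainScreen = [UIScreen mainScreen];
const CGFloat scaleFactor = mainScreen.scale;

// pixel dimensions of the texture: the view's size in points multiplied by the screen scale
const size_t widthPx  = (size_t)(cV.bounds.size.width  * scaleFactor);
const size_t heightPx = (size_t)(cV.bounds.size.height * scaleFactor);

// buffer and bitmap context are allocated at pixel size...
GLubyte *pixelBuffer = (GLubyte *)malloc(4 * widthPx * heightPx);
CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context =
CGBitmapContextCreate(pixelBuffer,
                      widthPx, heightPx,
                      8, 4 * widthPx,
                      colourSpace,
                      kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colourSpace);

// ...but the layer still draws itself in points, so the CTM is scaled up to fill the whole context
CGContextScaleCTM(context, scaleFactor, scaleFactor);
[cV.layer renderInContext:context];

// upload at pixel size as well
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGBA,
             (GLsizei)widthPx, (GLsizei)heightPx, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixelBuffer);

CGContextRelease(context);
free(pixelBuffer);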