I have been assigned the task of writing a program that takes a sample raw YUV file and displays it in a Cocoa OpenGL program.

I am an intern at my job and I have little or no clue how to start. I have been reading Wikipedia and articles on YUV, but I couldn't find any good source code on how to open a raw YUV file, extract the data, convert it into RGB, and display it in the view window.

Essentially, I need help with the following aspects of the task:
- how to extract the YUV data from the sample YUV file
- how to convert the YUV data into the RGB color space
- how to display the RGB data in OpenGL (this one I think I can figure out with time, but I really need help with the first two points)

Please either tell me the classes to use, or point me to places where I can learn about YUV graphics/video display.
Adam Rosenfield's comment is incorrect. On Macs, you can display YCbCr (the digital equivalent to YUV) textures using the GL_YCBCR_422_APPLE texture format, as specified in the APPLE_ycbcr_422 extension.
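As a rough sketch of what an upload in that format can look like (width, height, and frame are placeholder names for your frame dimensions and packed 4:2:2 data, not anything defined by the extension):

    // Upload a packed 4:2:2 YCbCr frame directly; the driver handles the conversion.
    // Use GL_UNSIGNED_SHORT_8_8_APPLE or GL_UNSIGNED_SHORT_8_8_REV_APPLE,
    // whichever matches the byte ordering of your source frames.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, frame);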

This answer is not correct; see the other answers and comments. Original answer left below for posterity.
You can't display it directly. You'll need to convert it to an RGB texture. As you may have gathered from Wikipedia, there are a bunch of variations on the YUV color space. Make sure you're using the right one. For each pixel, the conversion from YUV to RGB is a straightforward linear transformation. You just do the same thing to each pixel independently.
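For example, here is a sketch of the per-pixel transform, assuming full-range BT.601 (JPEG-style) YCbCr with the chroma channels biased by 128; the coefficients are an assumption and should be adjusted for whichever YUV variant your file actually uses:

    // Clamp a float to the 0..255 range of an 8-bit channel.
    static unsigned char clamp255(float v)
    {
        return (unsigned char)(v < 0.0f ? 0.0f : (v > 255.0f ? 255.0f : v));
    }

    // Convert one full-range BT.601 YCbCr pixel to 8-bit RGB.
    static void yuv_to_rgb(unsigned char y, unsigned char u, unsigned char v,
                           unsigned char *r, unsigned char *g, unsigned char *b)
    {
        float fy = (float)y;
        float fu = (float)u - 128.0f;   // Cb, re-centered around zero
        float fv = (float)v - 128.0f;   // Cr, re-centered around zero

        *r = clamp255(fy + 1.402f * fv);
        *g = clamp255(fy - 0.344136f * fu - 0.714136f * fv);
        *b = clamp255(fy + 1.772f * fu);
    }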
Once you've converted the image to RGB, you can display it by creating a texture. You need to call glGenTextures() to allocate a texture handle, glBindTexture() to bind the texture to the render context, and glTexImage2D() to upload the texture data to the GPU. To render it, you again call glBindTexture(), followed by the rendering of a quad with texture coordinates set up properly.
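A sketch of that sequence, assuming the converted pixels live in an rgbPixels buffer of width by height (placeholder names) and that the projection maps the quad to the viewport:

    GLuint texture = 0;

    // One-time setup: allocate a texture and upload the converted RGB data.
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   // tightly packed 3-byte RGB rows
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, rgbPixels);

    // Each frame: bind the texture and draw a quad with texture coordinates.
    glBindTexture(GL_TEXTURE_2D, texture);
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f, -1.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f, -1.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f,  1.0f);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f,  1.0f);
    glEnd();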
And don't forget to call glEnable(GL_TEXTURE_2D) at some point during initialization, and to call glDeleteTextures(1, &texture) during shutdown.

I've done this with YUV frames captured from a CCD camera. Unfortunately, there are a number of different YUV formats. I believe the one that Apple uses for the GL_YCBCR_422_APPLE texture format is technically 2VUY422. To convert an image from a YUV422 frame generated by an IIDC FireWire camera to 2VUY422, I've used the following:
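(The original conversion routine isn't reproduced here; the sketch below assumes the camera delivers YUYV-ordered bytes, Y0 Cb Y1 Cr, and repacks them into the Cb Y0 Cr Y1 ordering that 2VUY expects. The function and variable names are placeholders, and if your source is already Cb Y0 Cr Y1 you can skip this step.)

    // Repack a YUYV (Y0 Cb Y1 Cr) frame into 2VUY (Cb Y0 Cr Y1) byte order.
    static void yuyv_to_2vuy(const unsigned char *theYUVFrame,
                             unsigned char *the2vuyFrame,
                             unsigned int width, unsigned int height)
    {
        unsigned int i;
        unsigned int totalBytes = width * height * 2;   // 2 bytes per pixel in 4:2:2

        for (i = 0; i < totalBytes; i += 4)
        {
            the2vuyFrame[i]     = theYUVFrame[i + 1];   // Cb
            the2vuyFrame[i + 1] = theYUVFrame[i];       // Y0
            the2vuyFrame[i + 2] = theYUVFrame[i + 3];   // Cr
            the2vuyFrame[i + 3] = theYUVFrame[i + 2];   // Y1
        }
    }

For efficient display of a YUV video source, you may wish to use Apple's client storage extension, which you can set up using something like the following (videoImageWidth, videoImageHeight, videoTexture, and videoTextureID are placeholder names for the frame size, the client-side buffer holding the 2VUY data, and the texture handle):

    // Bind a rectangle texture whose storage stays in application memory,
    // so the driver can read frame data directly from videoTexture.
    glEnable(GL_TEXTURE_RECTANGLE_EXT);
    glGenTextures(1, &videoTextureID);
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, videoTextureID);

    glTextureRangeAPPLE(GL_TEXTURE_RECTANGLE_EXT,
                        videoImageWidth * videoImageHeight * 2, videoTexture);
    glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_STORAGE_HINT_APPLE,
                    GL_STORAGE_SHARED_APPLE);
    glPixelStorei(GL_UNPACK_CLIENT_STORAGE_APPLE, GL_TRUE);

    glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // Use GL_UNSIGNED_SHORT_8_8_APPLE or GL_UNSIGNED_SHORT_8_8_REV_APPLE,
    // whichever matches the byte order of your 2VUY frames.
    glTexImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, GL_RGBA,
                 videoImageWidth, videoImageHeight, 0,
                 GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, videoTexture);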
This lets you quickly change out the data stored within your client-side video texture before each frame is displayed on the screen.
To draw, you could then use code like the following:
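A sketch, continuing with the same placeholder names as above; note that rectangle textures take texel coordinates rather than normalized 0..1 coordinates:

    // Push the latest frame from client memory into the texture, then draw it
    // on a quad that fills the viewport. The texture coordinates flip the image
    // vertically so row 0 of the frame appears at the top of the window.
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, videoTextureID);
    glTexSubImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, 0, 0,
                    videoImageWidth, videoImageHeight,
                    GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, videoTexture);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(-1.0, 1.0, -1.0, 1.0, -1.0, 1.0);

    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, (GLfloat)videoImageHeight);
        glVertex2f(-1.0f, -1.0f);
        glTexCoord2f((GLfloat)videoImageWidth, (GLfloat)videoImageHeight);
        glVertex2f(1.0f, -1.0f);
        glTexCoord2f((GLfloat)videoImageWidth, 0.0f);
        glVertex2f(1.0f, 1.0f);
        glTexCoord2f(0.0f, 0.0f);
        glVertex2f(-1.0f, 1.0f);
    glEnd();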