Since iOS 6, Apple has provided a way to create a CIImage directly from native YUV data through this call:
initWithCVPixelBuffer:options:
The Core Image Programming Guide mentions this feature:
Take advantage of the support for YUV images in iOS 6.0 and later. Camera pixel buffers are natively YUV, but most image processing algorithms expect RGBA data. There is a cost to converting between the two. Core Image supports reading YUV from CVPixelBuffer objects and applying the appropriate color transform.
options = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };
But I am unable to use it properly. I have raw YUV data, so this is what I did:
void *YUV[3] = { data[0], data[1], data[2] };
size_t planeWidth[3] = { width, width / 2, width / 2 };
size_t planeHeight[3] = { height, height / 2, height / 2 };
size_t planeBytesPerRow[3] = { stride, stride / 2, stride / 2 };

CVPixelBufferRef pixelBuffer = NULL;
CVReturn ret = CVPixelBufferCreateWithPlanarBytes(kCFAllocatorDefault,
                                                  width,
                                                  height,
                                                  kCVPixelFormatType_420YpCbCr8PlanarFullRange,
                                                  NULL,                   // dataPtr
                                                  width * height * 1.5,   // dataSize
                                                  3,                      // numberOfPlanes
                                                  YUV,
                                                  planeWidth,
                                                  planeHeight,
                                                  planeBytesPerRow,
                                                  NULL,                   // releaseCallback
                                                  NULL, NULL, &pixelBuffer);
NSDictionary *opt = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                           @(kCVPixelFormatType_420YpCbCr8PlanarFullRange) };
CIImage *image = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer options:opt];
I am getting nil for image. Any idea what I am missing?
EDIT: I added lock and unlock of the base address before the call. I also dumped the data of the pixel buffer to ensure the pixel buffer properly holds the data. It looks like something is wrong with the init call only; the CIImage object still returns nil.
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
CIImage *image = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer options:opt];
CVPixelBufferUnlockBaseAddress(pixelBuffer,0);
I am working on a similar problem and kept finding that same quote from Apple without any further information on how to work in a YUV color space. I came upon the following:
I note that there are no YUV color spaces, only Gray and RGB, and their calibrated cousins. I'm not sure how to convert the color space yet, but will certainly report back here if I find out.
There should be an error message in the console:
initWithCVPixelBuffer failed because the CVPixelBufferRef is not IOSurface backed
See Apple's Technical Q&A QA1781 for how to create an IOSurface-backed CVPixelBuffer.
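Following QA1781, a minimal sketch of creating an IOSurface-backed buffer might look like the code below. This is untested and makes assumptions: passing an empty dictionary for kCVPixelBufferIOSurfacePropertiesKey asks Core Video to allocate the backing IOSurface, and a biplanar (NV12-style) format is used, so the separate Cb and Cr planes from a three-plane source would first have to be interleaved into one CbCr plane.

```objc
// Sketch per Apple QA1781 (assumption: biplanar 420 full-range data).
NSDictionary *attrs = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
CVPixelBufferRef ioSurfaceBuffer = NULL;
CVReturn ret = CVPixelBufferCreate(kCFAllocatorDefault,
                                   width,
                                   height,
                                   kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                                   (__bridge CFDictionaryRef)attrs,
                                   &ioSurfaceBuffer);
if (ret == kCVReturnSuccess) {
    CVPixelBufferLockBaseAddress(ioSurfaceBuffer, 0);
    // Copy the Y plane row by row into
    // CVPixelBufferGetBaseAddressOfPlane(ioSurfaceBuffer, 0), and the
    // interleaved CbCr plane into plane 1, honoring
    // CVPixelBufferGetBytesPerRowOfPlane() for each destination plane.
    CVPixelBufferUnlockBaseAddress(ioSurfaceBuffer, 0);
    CIImage *image = [CIImage imageWithCVPixelBuffer:ioSurfaceBuffer];
    // image should no longer be nil, since the buffer is IOSurface backed.
}
```

Note that this copies the planes into a buffer Core Video allocates, rather than wrapping your existing pointers with CVPixelBufferCreateWithPlanarBytes; buffers wrapping client-owned memory are not IOSurface backed, which is exactly what the error message above complains about.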