I'm trying to get as good an image as possible from the camera, but can only find examples that call captureStillImageAsynchronouslyFromConnection: and then go straight to:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
JPEG being lossy and all, is there any way to get the data as PNG, or even just raw RGBA (BGRA, what have you)? AVCaptureStillImageOutput doesn't seem to have any other NSData* methods.
Actually, looking at the CMSampleBufferRef, it seems it's already locked in as JPEG:
formatDescription = <CMVideoFormatDescription 0xfe5e1f0 [0x3e5ac650]> {
    mediaType:'vide'
    mediaSubType:'jpeg'
    mediaSpecific: {
        codecType: 'jpeg'    dimensions: 2592 x 1936
    }
    extensions: {(null)}
}
Is there some other way to take a full-res picture and get the raw data?
You'll need to set the outputSettings dictionary to request a different pixel format. If you want 32-bit BGRA, for example, you can set kCVPixelBufferPixelFormatTypeKey to kCVPixelFormatType_32BGRA.
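A minimal sketch of that setting, assuming `stillImageOutput` is a variable holding your configured AVCaptureStillImageOutput:

```objc
#import <AVFoundation/AVFoundation.h>

// Request uncompressed 32-bit BGRA sample buffers instead of JPEG.
stillImageOutput.outputSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};
```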
From https://developer.apple.com/library/mac/#documentation/AVFoundation/Reference/AVCaptureStillImageOutput_Class/Reference/Reference.html, the "recommended" pixel formats are:
kCMVideoCodecType_JPEG
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
kCVPixelFormatType_32BGRA
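You can also check at runtime which pixel formats the device actually supports; a sketch, assuming the same `stillImageOutput` variable:

```objc
// availableImageDataCVPixelFormatTypes is an NSArray of NSNumbers,
// each wrapping a CVPixelFormatType constant the output supports.
for (NSNumber *format in stillImageOutput.availableImageDataCVPixelFormatTypes) {
    NSLog(@"supported pixel format: 0x%08x", format.unsignedIntValue);
}
```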
Of course, if you're not using JPEG output, you can't use jpegStillImageNSDataRepresentation:, but there's an example here: how to convert a CVImageBufferRef to UIImage.

Just change the output settings of your connection:
outputSettings — The compression settings for the output.

You can retrieve the supported values with:

availableImageDataCodecTypes — The supported image codec formats that can be specified in outputSettings. (read-only)
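Once the output is configured for BGRA, the completion handler's sample buffer holds raw pixels you can wrap in a UIImage yourself. A sketch of the linked CVImageBufferRef-to-UIImage conversion, assuming kCVPixelFormatType_32BGRA output and a hypothetical helper name:

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Convert a 32BGRA CVImageBufferRef from the still-image sample buffer
// into a UIImage without ever going through JPEG.
static UIImage *ImageFromSampleBuffer(CMSampleBufferRef sampleBuffer) {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Little-endian BGRA with premultiplied alpha matches kCVPixelFormatType_32BGRA.
    CGContextRef context = CGBitmapContextCreate(base, width, height, 8, bytesPerRow,
        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return image;
}
```

From there, UIImagePNGRepresentation(image) gives you lossless PNG data instead of JPEG.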