I am building my first iOS application, which processes still frames from video on the fly. To get started, I followed an example from Apple's AVFoundation documentation.
The process involves setting up an input (the camera) and an output on a capture session. The output delivers frames to a delegate, which in this case is the controller itself (it conforms to the protocol and implements the required method).
The problem I am having is that the delegate method never gets called. The code below is the controller's implementation, with a couple of NSLogs added. I can see the "started" message, but "delegate method called" never appears.
All of this code lives in a controller that conforms to the AVCaptureVideoDataOutputSampleBufferDelegate protocol.
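For completeness, the controller's interface is declared roughly as follows (the class name is just a placeholder; videoDataOutputQueue and theImage are the ivar and outlet used in the code below):

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface MyViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate> {
    // Serial queue on which the sample buffer delegate is called
    dispatch_queue_t videoDataOutputQueue;
}

// Image view that should display the processed frame
@property (nonatomic, strong) IBOutlet UIImageView *theImage;

@end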
- (void)viewDidLoad {
    [super viewDidLoad];

    // Initialize AV session
    AVCaptureSession *session = [AVCaptureSession new];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    else
        [session setSessionPreset:AVCaptureSessionPresetPhoto];

    // Initialize back camera input
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if ([session canAddInput:input]) {
        [session addInput:input];
    }

    // Initialize image output
    AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];
    NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [output setVideoSettings:rgbOutputSettings];
    [output setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked (as we process the still image)
    //[output addObserver:self forKeyPath:@"capturingStillImage" options:NSKeyValueObservingOptionNew context:@"AVCaptureStillImageIsCapturingStillImageContext"];

    videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:videoDataOutputQueue];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }
    [[output connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];

    [session startRunning];
    NSLog(@"started");
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"delegate method called");
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    // The delegate is called on videoDataOutputQueue, so push the UI update onto the main queue
    dispatch_async(dispatch_get_main_queue(), ^{
        self.theImage.image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
    });
}
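In case it's relevant, imageFromSampleBuffer: isn't shown above; it's essentially the standard conversion from Apple's sample code, roughly the following (it assumes the 32BGRA pixel format set in the output settings):

// Build a CGImage from a BGRA pixel buffer; the caller is responsible for CGImageRelease
- (CGImageRef)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    return cgImage;
}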
Note: I'm building with iOS 5.0 as a target.
Edit:
I've found a question that, although asking about a different problem, does exactly what my code is supposed to do. I copied the code from that question verbatim into a blank Xcode project and added NSLogs to the captureOutput: method, and it still doesn't get called. Is this a configuration issue? Is there something I'm missing?