AVCaptureVideoDataOutput not calling delegate method


Question:

I am building an iOS application (my first) that processes video still frames on the fly. To dive into this, I followed an example from Apple's AV Foundation documentation.

The process involves setting up an input (the camera) and an output. The output works with a delegate, which in this case is the controller itself (it conforms to the protocol and implements the required method).

The problem I am having is that the delegate method never gets called. The code below is the implementation of the controller, and it contains a couple of NSLog calls. I can see the "started" message, but "delegate method called" never shows.

This code all lives in a controller that conforms to the AVCaptureVideoDataOutputSampleBufferDelegate protocol.

- (void)viewDidLoad {

    [super viewDidLoad];

    // Initialize AV session    
        AVCaptureSession *session = [AVCaptureSession new];

        if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
            [session setSessionPreset:AVCaptureSessionPreset640x480];
        else
            [session setSessionPreset:AVCaptureSessionPresetPhoto];

    // Initialize back camera input
        AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

        NSError *error = nil;

        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];

        if( [session canAddInput:input] ){
            [session addInput:input];
        }


    // Initialize image output
        AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];

        NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
                                           [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        [output setVideoSettings:rgbOutputSettings];
        [output setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked (as we process the still image)


        //[output addObserver:self forKeyPath:@"capturingStillImage" options:NSKeyValueObservingOptionNew context:@"AVCaptureStillImageIsCapturingStillImageContext"];

        videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [output setSampleBufferDelegate:self queue:videoDataOutputQueue];


        if( [session canAddOutput:output] ){
            [session addOutput:output];
        }

        [[output connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];


    [session startRunning];

    NSLog(@"started");


}


- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

        NSLog(@"delegate method called");

        CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];

        self.theImage.image = [UIImage imageWithCGImage: cgImage ];

        CGImageRelease( cgImage );

}
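
For reference, imageFromSampleBuffer: is not shown above. Here is a minimal sketch of the usual implementation, based on Apple's standard sample code and assuming the 32BGRA output format configured earlier; it returns a CGImageRef that the caller must release:

- (CGImageRef)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {

    // Get the pixel buffer and lock its base address for reading
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // 32BGRA corresponds to a little-endian context with alpha first
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    return cgImage; // caller is responsible for CGImageRelease()
}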

Note: I'm building with iOS 5.0 as a target.

Edit:

I've found a question that, although asking for a solution to a different problem, is doing exactly what my code is supposed to do. I've copied the code from that question verbatim into a blank Xcode app and added NSLog calls to the captureOutput method, and it doesn't get called. Is this a configuration issue? Is there something I'm missing?

Answer 1:

Your session is a local variable. Its scope is limited to viewDidLoad. Since this is a new project, I assume it's safe to say that you're using ARC. In that case the object won't leak and therefore continue to live as it would have done in the linked question; rather, the compiler will ensure it is deallocated before viewDidLoad exits.

Hence your session isn't running because it no longer exists.

(aside: the self.theImage.image = ... is unsafe since it performs a UIKit action off the main queue; you probably want to dispatch_async that over to dispatch_get_main_queue())

So, sample corrections:

@implementation YourViewController
{
     AVCaptureSession *session;
}

- (void)viewDidLoad {

    [super viewDidLoad];

    // Initialize AV session    
        session = [AVCaptureSession new];

        if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
            [session setSessionPreset:AVCaptureSessionPreset640x480];
        else
         /* ... etc ... */
}


- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

        NSLog(@"delegate method called");

        CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];

        dispatch_sync(dispatch_get_main_queue(),
        ^{
            self.theImage.image = [UIImage imageWithCGImage: cgImage ];
            CGImageRelease( cgImage );
         });
}

Most people advocate prefixing instance variable names with an underscore nowadays, but I omitted it for simplicity. You can use Xcode's built-in refactor tool to fix that up after you've verified that the diagnosis is correct.
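
(For instance, a declared property auto-synthesizes an underscore-prefixed instance variable; YourViewController and the property name here are just placeholders:)

@interface YourViewController ()

// The compiler auto-synthesizes a backing ivar named _session
@property (nonatomic, strong) AVCaptureSession *session;

@end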

I moved the CGImageRelease inside the block sent to the main queue to ensure the image's lifetime extends until it has been captured into a UIImage. I'm not immediately able to find any documentation confirming that Core Foundation objects have their lifetime automatically extended when captured in a block.
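
An alternative that sidesteps the lifetime question entirely (my suggestion, not part of the original answer) is to create the UIImage on the capture queue and release the CGImage immediately, since blocks do retain captured Objective-C objects:

CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];

// UIImage retains the CGImage, so it is safe to release it right away
UIImage *image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);

dispatch_async(dispatch_get_main_queue(), ^{
    self.theImage.image = image; // the block keeps 'image' alive
});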



Answer 2:

I've found one more reason why the didOutputSampleBuffer delegate method may not be called: save-to-file and sample buffer output connections are mutually exclusive. In other words, if your session already has an AVCaptureMovieFileOutput and you then add an AVCaptureVideoDataOutput, only the AVCaptureFileOutputRecordingDelegate delegate methods are called.

For reference, I couldn't find an explicit description of this limitation anywhere in the AV Foundation framework documentation, but Apple support confirmed it a few years ago, as noted in this SO answer.

One way to solve the problem is to remove the AVCaptureMovieFileOutput entirely and manually write the recorded frames to a file in the didOutputSampleBuffer delegate method, alongside your custom buffer processing. You may find these two SO answers useful.
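
As a rough sketch of that approach (the _assetWriter and _writerInput ivars, the output URL parameter, and the 640x480 settings are illustrative assumptions, not from the original answer):

- (void)setUpWriterWithURL:(NSURL *)url {

    NSError *error = nil;
    _assetWriter = [[AVAssetWriter alloc] initWithURL:url
                                             fileType:AVFileTypeQuickTimeMovie
                                                error:&error];

    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @640,
                                AVVideoHeightKey : @480 };
    _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                      outputSettings:settings];
    _writerInput.expectsMediaDataInRealTime = YES;
    [_assetWriter addInput:_writerInput];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    // ... custom per-frame processing goes here ...

    // Start the writer on the first frame, using its timestamp
    if (_assetWriter.status == AVAssetWriterStatusUnknown) {
        [_assetWriter startWriting];
        [_assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }

    // Append the frame if the input can accept more data
    if (_writerInput.isReadyForMoreMediaData) {
        [_writerInput appendSampleBuffer:sampleBuffer];
    }
}

When recording ends, call finishWritingWithCompletionHandler: (or finishWriting on older iOS versions) to close the file.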



Answer 3:

In my case, the problem was that I called

if ([_session canAddOutput:_videoDataOutput])
        [_session addOutput:_videoDataOutput];

before I called

[_session startRunning];

The delegate method started firing once I moved the addOutput: call to after startRunning, as sketched below.

Hope this helps somebody.
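
A minimal sketch of that ordering (assuming _session and _videoDataOutput are already-configured ivars; the beginConfiguration/commitConfiguration pair is my addition, generally recommended when modifying a running session):

[_session startRunning];

// Modify the already-running session inside a configuration block
[_session beginConfiguration];
if ([_session canAddOutput:_videoDataOutput]) {
    [_session addOutput:_videoDataOutput];
}
[_session commitConfiguration];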