
Capturing Images from AVCaptureSession

Posted 2020-02-08 21:57

Question:

I am learning about AVCaptureSession and how to capture multiple images with its delegate method

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
     didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
     fromConnection:(AVCaptureConnection *)connection

My goal is to capture one or more images at a predefined rate, for example 1 or 2 images per second. So I set:

 AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
 captureOutput.alwaysDiscardsLateVideoFrames = YES;
 captureOutput.minFrameDuration = CMTimeMake(1, 1); // intended: at most one frame per second

When [self.captureSession startRunning] is called, my log shows the delegate being invoked 20 times a second. Where is that coming from, and how do I capture images at my intended interval?

Answer 1:

You can use the method given below, and if you want to capture at specific intervals, set a timer and call it repeatedly (a sketch of the timer setup follows the code).

-(IBAction)captureNow
{
    // Find the video connection on the still image output.
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [stillImageOutput connections])
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection)
        {
            break;
        }
    }

    NSLog(@"About to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        // Inspect the EXIF metadata attached to the captured frame.
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"Attachments: %@", exifAttachments);
        }
        else
        {
            NSLog(@"No attachments found.");
        }

        // Convert the sample buffer to JPEG data and hand it to the image view.
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [[self vImage] setImage:image];
    }];
}
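
For example, a minimal sketch of a repeating timer that drives captureNow once per second. The captureTimer property, the 1.0-second interval, and the start/stop helper names are assumptions for illustration, not part of the original code:

// Assumed property on the same view controller; not from the original answer.
@property (nonatomic, strong) NSTimer *captureTimer;

// Start requesting one still image per second (interval is an assumption; adjust to taste).
- (void)startTimedCapture
{
    self.captureTimer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                         target:self
                                                       selector:@selector(captureTimerFired:)
                                                       userInfo:nil
                                                        repeats:YES];
}

- (void)captureTimerFired:(NSTimer *)timer
{
    [self captureNow];
}

// Stop capturing and release the timer.
- (void)stopTimedCapture
{
    [self.captureTimer invalidate];
    self.captureTimer = nil;
}

Remember to invalidate the timer when you are done, otherwise it keeps firing and retains its target.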

For more reference, see iOS4: Take photos with live video preview using AVFoundation.



Answer 2:

Something I struggled with for a while was a massive delay (~5 seconds) between taking a picture and setting a UIImage with the captured image. In the

 - (void)captureOutput:(AVCaptureOutput *)captureOutput 
 didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
 fromConnection:(AVCaptureConnection *)connection

method, you can't call methods that touch the UI, such as [self.image setImage:img], directly; you have to run them on the main thread, like so:

 [self.image performSelectorOnMainThread:@selector(setImage:) withObject:img waitUntilDone:TRUE];
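
Equivalently, you can dispatch the UI update to the main queue with GCD; this is just a sketch of the same idea using dispatch_async:

 dispatch_async(dispatch_get_main_queue(), ^{
     // UIKit calls must happen on the main thread.
     [self.image setImage:img];
 });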

Hope this helps someone.