iOS/iPhone photo burst mode API

Asked 2020-05-24 05:55

I'm trying to capture multiple photos at the highest resolution (AVCaptureSessionPresetPhoto) on an iPhone 5s. I tried using the following code:

    dispatch_semaphore_t sync = dispatch_semaphore_create(0);
    while ([self isBurstModeEnabled] == YES)
    {
        [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                      completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
        {
            if (imageSampleBuffer != NULL)
            {
                // Note: this is JPEG data, despite the .png extension used below.
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
                NSString *videoThumbPath = [NSString stringWithFormat:@"%@/img%d.png",
                                            burstFolderPath,
                                            index];

                [imageData writeToFile:videoThumbPath atomically:YES];
                if (0 == index)
                {
                    [self NSLogPrint:[NSString stringWithFormat:@"Created photo at %@", videoThumbPath]];
                }
            }
            // Signal that this capture has finished.
            dispatch_semaphore_signal(sync);
        }];
        // Block until the capture above completes before requesting the next one.
        dispatch_semaphore_wait(sync, DISPATCH_TIME_FOREVER);
    }

Using this code I get about 2 photos per second, nowhere near the performance of the native camera app's burst mode. What am I doing wrong? I also tried the code above without the semaphore, but then I saw odd behavior: some photos were missing (img0.png, img1.png, and img3.png would be present, but img2.png would be missing). Without the semaphore the performance was better, but still not on par with the native app (in my tests the camera app took about 8.4 photos per second).

2 Answers

劫难 · answered 2020-05-24 06:10

captureStillImageAsynchronouslyFromConnection:completionHandler: isn't, I believe, what Apple is using for its burst mode.

Instead, Apple is* grabbing video frames at full resolution (which is supported by the 5s). Here's how:

Set the AVCaptureDevice's activeFormat to the full sensor resolution, then grab and process roughly 10 frames per second from AVCaptureVideoDataOutputSampleBufferDelegate's captureOutput:didOutputSampleBuffer:fromConnection:, firing off a shutter sound for each frame you keep.
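
Here's a minimal sketch of that setup, assuming ARC and an already configured AVCaptureSession; the method name configureFullSensorVideo:session:, the ~10 fps cap via activeVideoMinFrameDuration, and the shutter-sound choice are illustrative assumptions, not Apple's implementation:

    #import <AVFoundation/AVFoundation.h>
    #import <AudioToolbox/AudioToolbox.h>

    - (void)configureFullSensorVideo:(AVCaptureDevice *)device
                             session:(AVCaptureSession *)session
    {
        // Pick the device format with the largest dimensions (full sensor on a 5s).
        AVCaptureDeviceFormat *bestFormat = nil;
        int32_t bestWidth = 0;
        for (AVCaptureDeviceFormat *format in device.formats) {
            CMVideoDimensions dims =
                CMVideoFormatDescriptionGetDimensions(format.formatDescription);
            if (dims.width > bestWidth) {
                bestWidth = dims.width;
                bestFormat = format;
            }
        }

        NSError *error = nil;
        if (bestFormat != nil && [device lockForConfiguration:&error]) {
            device.activeFormat = bestFormat;
            // Cap delivery at roughly 10 frames per second.
            device.activeVideoMinFrameDuration = CMTimeMake(1, 10);
            [device unlockForConfiguration];
        }

        // Route frames to this object on a dedicated serial queue.
        AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        videoOutput.alwaysDiscardsLateVideoFrames = YES;
        [videoOutput setSampleBufferDelegate:self
                                       queue:dispatch_queue_create("burst.frames", NULL)];
        if ([session canAddOutput:videoOutput]) {
            [session addOutput:videoOutput];
        }
    }

    // AVCaptureVideoDataOutputSampleBufferDelegate
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        if (![self isBurstModeEnabled]) {
            return;
        }
        // 1108 is the system camera-shutter sound.
        AudioServicesPlaySystemSound(1108);
        // Compress and save the frame here; JPEG encoding is far cheaper than
        // PNG for full-resolution frames (conversion omitted for brevity).
    }

Setting alwaysDiscardsLateVideoFrames keeps the delegate from backing up if handling a frame takes too long; late frames are simply dropped rather than queued.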

You'll need a fallback (either lower-resolution images or a slower burst mode) for devices that don't support video at the full sensor resolution, and/or if you want to support anything older than iOS 7.x.

Note that you can't have multiple concurrent calls to captureStillImageAsynchronouslyFromConnection:completionHandler: in flight without some extremely unexpected results. This is why you should kick off each capture from the previous one's completionHandler (which, in essence, is what your semaphore is doing; see the sketch after the footnote below). Also, you may wish to switch away from PNG as the file format for burst shots: it saves very slowly and uses a lot of system resources, and stacking up 15 or 20 PNG writes could cause you some serious grief!

*Apple is probably doing this; it may, of course, be using a private API to achieve the same end result instead.
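
As an illustration of that chaining, here's a sketch reusing the question's stillImageOutput, videoConnection, and burstFolderPath; the takeNextBurstPhoto name and the burstIndex property are assumptions for the example:

    - (void)takeNextBurstPhoto
    {
        [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                      completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
        {
            if (imageSampleBuffer != NULL) {
                NSData *imageData =
                    [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
                // Save as .jpg: the data really is JPEG, and it writes much
                // faster than PNG would.
                NSString *path = [NSString stringWithFormat:@"%@/img%lu.jpg",
                                  burstFolderPath, (unsigned long)self.burstIndex++];
                [imageData writeToFile:path atomically:YES];
            }
            // Start the next capture only after this one finishes, so no two
            // requests to the still image output ever overlap.
            if ([self isBurstModeEnabled]) {
                [self takeNextBurstPhoto];
            }
        }];
    }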

太酷不给撩 · answered 2020-05-24 06:18

Use this method for burst mode in iOS 8 and above:

    - (void)captureStillImageBracketAsynchronouslyFromConnection:(AVCaptureConnection *)connection
                                               withSettingsArray:(NSArray *)settings
                                               completionHandler:(void (^)(CMSampleBufferRef sampleBuffer, AVCaptureBracketedStillImageSettings *stillImageSettings, NSError *error))handler NS_AVAILABLE_IOS(8_0);

Documentation
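
For example, a minimal usage sketch against the question's stillImageOutput and videoConnection; the bracket size of 3 and the use of identical auto-exposure settings are illustrative choices:

    // Request a bracket of identically exposed full-resolution stills in one call.
    NSUInteger count = MIN(3, stillImageOutput.maxBracketedCaptureStillImageCount);
    NSMutableArray *settings = [NSMutableArray arrayWithCapacity:count];
    for (NSUInteger i = 0; i < count; i++) {
        [settings addObject:
            [AVCaptureAutoExposureBracketedStillImageSettings
                autoExposureSettingsWithExposureTargetBias:AVCaptureExposureTargetBiasCurrent]];
    }

    [stillImageOutput captureStillImageBracketAsynchronouslyFromConnection:videoConnection
                                                         withSettingsArray:settings
                                                         completionHandler:
        ^(CMSampleBufferRef sampleBuffer, AVCaptureBracketedStillImageSettings *stillImageSettings, NSError *error)
    {
        // The handler fires once per image in the bracket.
        if (sampleBuffer != NULL) {
            NSData *jpeg =
                [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
            // Write jpeg to disk as in the question's code.
        }
    }];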
