I'm trying to capture multiple photos at the highest resolution (AVCaptureSessionPresetPhoto) on an iPhone 5s. I tried the following code:
dispatch_semaphore_t sync = dispatch_semaphore_create(0);
NSUInteger index = 0; // file index, advanced only after a capture finishes
while ([self isBurstModeEnabled])
{
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                   completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        if (imageSampleBuffer != NULL)
        {
            // jpegStillImageNSDataRepresentation returns JPEG data, so the
            // file is written with a .jpg extension.
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            NSString *photoPath = [NSString stringWithFormat:@"%@/img%lu.jpg",
                                   burstFolderPath, (unsigned long)index];
            [imageData writeToFile:photoPath atomically:YES];
            if (index == 0)
            {
                [self NSLogPrint:[NSString stringWithFormat:@"Created photo at %@", photoPath]];
            }
        }
        // Let the loop start the next capture.
        dispatch_semaphore_signal(sync);
    }];
    // Block until the completion handler has run before capturing again.
    dispatch_semaphore_wait(sync, DISPATCH_TIME_FOREVER);
    index++;
}
With this code I get about two photos per second, nowhere near the performance of the native camera app's burst mode. What am I doing wrong? I also tried the code above without the semaphore, but then some photos went missing (img0.jpg, img1.jpg, and img3.jpg would be present, but img2.jpg would be missing). Without the semaphore the throughput was better, but still not on par with the native app (in my tests the camera app took about 8.4 photos per second).
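My guess is that the missing files come from the in-flight completion handlers racing on the shared index (a failed capture, where imageSampleBuffer is NULL, would also leave a gap). For reference, here is a sketch of the non-semaphore variant I mean, with the index copied into a block-captured local before each capture so every handler writes a unique file name. This is untested; it assumes the loop runs on a single thread and uses the same stillImageOutput, videoConnection, and burstFolderPath as above:

NSUInteger index = 0;
while ([self isBurstModeEnabled])
{
    // Freeze the current index in a local; the block captures it by value
    // at creation time, so later increments of index cannot affect it.
    NSUInteger thisIndex = index;
    index++;

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                   completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        if (imageSampleBuffer != NULL)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            NSString *photoPath = [NSString stringWithFormat:@"%@/img%lu.jpg",
                                   burstFolderPath, (unsigned long)thisIndex];
            [imageData writeToFile:photoPath atomically:YES];
        }
    }];
}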