AVCaptureSession with multiple Outputs?

Posted 2020-02-26 03:42

I'm currently developing an iOS app that applies CoreImage to the camera feed in order to take photos and videos, and I've run into a bit of a snag.

Up till now, I've been using AVCaptureVideoDataOutput to obtain the sample buffers and manipulate them with CoreImage, then displaying a simple preview, as well as using it to capture and save photos.
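For reference, my current pipeline looks roughly like the sketch below (simplified; names like the "CISepiaTone" filter stand in for my actual CoreImage processing):

    // Rough sketch of the current setup: a video data output whose frames
    // get wrapped in a CIImage for filtering.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    if ([session canAddInput:input]) [session addInput:input];

    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoDataOutput setSampleBufferDelegate:self
                                       queue:dispatch_queue_create("Video Data Queue", DISPATCH_QUEUE_SERIAL)];
    if ([session canAddOutput:videoDataOutput]) [session addOutput:videoDataOutput];
    [session startRunning];

    // Then, inside captureOutput:didOutputSampleBuffer:fromConnection:,
    // each frame is filtered with CoreImage before being previewed or saved:
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIImage *processed = [frame imageByApplyingFilter:@"CISepiaTone"
                                  withInputParameters:@{kCIInputIntensityKey : @0.8}];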

When I tried to implement video recording by writing the sample buffers to a video file as I received them from the AVCaptureVideoDataOutput, the result had a very low frame rate (probably because of the other image-related processing that was going on).

So I was wondering: is it possible to have an AVCaptureVideoDataOutput and an AVCaptureMovieFileOutput running on the same AVCaptureSession simultaneously?

I gave it a quick go, and found that when I added the extra output, my AVCaptureVideoDataOutput stopped receiving sample buffers.

If I can get it working, I'm hoping it means that I can simply use the second output to record video at a high frame rate, and do post-processing on the video after the user has stopped recording.
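That post-processing pass might look something like the sketch below, assuming the CIFilter-based AVMutableVideoComposition API; recordedMovieURL, processedMovieURL, and the sepia filter are stand-ins for whatever I end up using:

    // Hypothetical post-processing: apply a CIFilter to the recorded movie
    // after capture, then export the filtered result.
    AVAsset *asset = [AVAsset assetWithURL:recordedMovieURL];
    AVMutableVideoComposition *composition =
        [AVMutableVideoComposition videoCompositionWithAsset:asset
                                applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request) {
            CIImage *filtered = [request.sourceImage imageByApplyingFilter:@"CISepiaTone"
                                                       withInputParameters:@{kCIInputIntensityKey : @0.8}];
            [request finishWithImage:filtered context:nil];
        }];

    AVAssetExportSession *exporter =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetHighestQuality];
    exporter.videoComposition = composition;
    exporter.outputURL = processedMovieURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        // Inspect exporter.status / exporter.error here.
    }];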

Any help will be greatly appreciated.

1 Answer

SAY GOODBYE · 2020-02-26 04:27

It's easier than you'd think.

See: AVCamDemo

  1. Capture data using AVCaptureVideoDataOutput.
  2. Create a new serial dispatch queue before recording, e.g. recordingQueue: recordingQueue = dispatch_queue_create("Movie Recording Queue", DISPATCH_QUEUE_SERIAL);
  3. In the captureOutput:didOutputSampleBuffer:fromConnection: delegate method, retain the sample buffer and, on the recording queue, write it to the file:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Retain the buffer so it is still valid when the async block runs.
        CFRetain(sampleBuffer);

        dispatch_async(recordingQueue, ^{
            if (assetWriter) {
                // Route the buffer to the matching writer input.
                if (connection == videoConnection) {
                    [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeVideo];
                } else if (connection == audioConnection) {
                    [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeAudio];
                }
            }
            // Balance the CFRetain above.
            CFRelease(sampleBuffer);
        });
    }
    
    - (void)writeSampleBuffer:(CMSampleBufferRef)sampleBuffer ofType:(NSString *)mediaType
    {
        CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

        // Lazily start the writer, anchoring the session to the first
        // buffer's timestamp.
        if (assetWriter.status == AVAssetWriterStatusUnknown) {
            if ([assetWriter startWriting]) {
                [assetWriter startSessionAtSourceTime:presentationTime];
            } else {
                NSLog(@"Error writing initial buffer");
            }
        }

        if (assetWriter.status == AVAssetWriterStatusWriting) {
            if ([mediaType isEqualToString:AVMediaTypeVideo]) {
                // Drop the frame if the input can't accept more data yet.
                if (assetWriterVideoIn.readyForMoreMediaData) {
                    if (![assetWriterVideoIn appendSampleBuffer:sampleBuffer]) {
                        NSLog(@"Error writing video buffer");
                    }
                }
            } else if ([mediaType isEqualToString:AVMediaTypeAudio]) {
                if (assetWriterAudioIn.readyForMoreMediaData) {
                    if (![assetWriterAudioIn appendSampleBuffer:sampleBuffer]) {
                        NSLog(@"Error writing audio buffer");
                    }
                }
            }
        }
    }
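The assetWriter and the two writer inputs referenced above have to be created before you start feeding buffers in. A minimal setup might look like this (the output settings, outputURL, and the audio-output assumption are placeholders to adapt to your app):

    // Minimal AVAssetWriter setup, run once before recording starts.
    // The settings below are illustrative, not tuned for any device.
    NSError *error = nil;
    assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&error];

    NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                     AVVideoWidthKey  : @1280,
                                     AVVideoHeightKey : @720 };
    assetWriterVideoIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                            outputSettings:videoSettings];
    assetWriterVideoIn.expectsMediaDataInRealTime = YES; // live capture, don't buffer

    NSDictionary *audioSettings = @{ AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
                                     AVNumberOfChannelsKey : @1,
                                     AVSampleRateKey       : @44100.0 };
    assetWriterAudioIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                            outputSettings:audioSettings];
    assetWriterAudioIn.expectsMediaDataInRealTime = YES;

    if ([assetWriter canAddInput:assetWriterVideoIn]) [assetWriter addInput:assetWriterVideoIn];
    if ([assetWriter canAddInput:assetWriterAudioIn]) [assetWriter addInput:assetWriterAudioIn];

    // The connections compared against in the delegate come from your
    // outputs (this assumes a separate AVCaptureAudioDataOutput for audio):
    videoConnection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
    audioConnection = [audioDataOutput connectionWithMediaType:AVMediaTypeAudio];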
    