
Why does AVCaptureSession output the wrong orientation?

Published 2019-01-21 04:45

Question:

So, I followed Apple's instructions for capturing a video session using AVCaptureSession: http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html. One problem I'm facing is that even though the orientation of the camera / iPhone device is vertical (and the AVCaptureVideoPreviewLayer shows a vertical camera stream), the output image seems to be in landscape mode. I checked the width and height of the imageBuffer inside imageFromSampleBuffer: in the sample code, and I got 640px and 480px respectively. Does anyone know why this is the case?

Thanks!

Answer 1:

Take a look at the header AVCaptureSession.h. There is an enum called AVCaptureVideoOrientation that defines the various video orientations. The AVCaptureConnection object has a videoOrientation property of type AVCaptureVideoOrientation. You should be able to set this to change the orientation of the video. You probably want AVCaptureVideoOrientationLandscapeRight or AVCaptureVideoOrientationLandscapeLeft.

You can find the AVCaptureConnections for the session by looking at the session's outputs. Each output has a connections property that is an array of the connections for that output, as sketched below.
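For instance, a minimal sketch of that lookup (assuming a configured AVCaptureSession named session; Portrait is just an example value here, swap in LandscapeRight or LandscapeLeft as needed):

for (AVCaptureOutput *output in session.outputs) {
    for (AVCaptureConnection *connection in output.connections) {
        // Not every connection can be rotated, so check support first
        if ([connection isVideoOrientationSupported]) {
            connection.videoOrientation = AVCaptureVideoOrientationPortrait;
        }
    }
}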



Answer 2:

I made a simple one-line modification to imageFromSampleBuffer: to correct the orientation problem (see my comment in the code under "I modified ..."). Hope it helps someone, because I spent too much time on this.

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer  {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context1 = CGBitmapContextCreate(baseAddress, width, height, 8, 
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context1); 
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context1); 
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    //I modified this line: [UIImage imageWithCGImage:quartzImage]; to the following to correct the orientation:
    UIImage *image =  [UIImage imageWithCGImage:quartzImage scale:1.0 orientation:UIImageOrientationRight]; 

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}


Answer 3:

Y'all are making this difficult.

In DidOutputSampleBuffer, simply change the orientation before you grab the image. It's Mono (C#), but you have:

    public class OutputRecorder : AVCaptureVideoDataOutputSampleBufferDelegate {
        public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
        {
            // Set the orientation before reading the image from the buffer
            connection.VideoOrientation = AVCaptureVideoOrientation.LandscapeLeft;
            // ... grab and process the image ...
        }
    }

In Objective-C it's this method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
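Filling that in, the Objective-C equivalent of the C# line above would be something like this (a sketch; the isVideoOrientationSupported guard is an extra precaution, not part of the original answer):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Rotate the connection before reading the image out of the buffer
    if ([connection isVideoOrientationSupported]) {
        connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
    }
    // ... grab and process the image from sampleBuffer ...
}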


Answer 4:

Here is the right sequence:

self.videoCaptureOutput = [[AVCaptureVideoDataOutput alloc] init];

if([self.captureSession canAddOutput:self.videoCaptureOutput]){
    [self.captureSession addOutput:self.videoCaptureOutput];
}else{
    NSLog(@"cantAddOutput");
}

// set portrait orientation
AVCaptureConnection *conn = [self.videoCaptureOutput connectionWithMediaType:AVMediaTypeVideo];
[conn setVideoOrientation:AVCaptureVideoOrientationPortrait];


Answer 5:

For instance:

AVCaptureConnection *captureConnection = <a capture connection>;
if ([captureConnection isVideoOrientationSupported]) {
    captureConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
}

The default appears to be AVCaptureVideoOrientationLandscapeRight.

See also QA1744: Setting the orientation of video with AV Foundation.



Answer 6:

For those who need to work with CIImage and find that the orientation from the buffer is wrong, I used this correction.

As easy as that. By the way, the numbers 3, 1, 6, 8 are from kCGImagePropertyOrientation: https://developer.apple.com/reference/imageio/kcgimagepropertyorientation

And don't ask me why 3, 1, 6, 8 is the right combination; I found it by brute force. If you know why, please leave an explanation in a comment...

- (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
{

    // common way to get CIImage

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);

    CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer
                                                      options:(__bridge NSDictionary *)attachments];

    if (attachments) {
       CFRelease(attachments);
    }

    // fixing the orientation of the CIImage

    UIInterfaceOrientation curOrientation = [[UIApplication sharedApplication] statusBarOrientation];

    if (curOrientation == UIInterfaceOrientationLandscapeLeft){
        ciImage = [ciImage imageByApplyingOrientation:3];
    } else if (curOrientation == UIInterfaceOrientationLandscapeRight){
        ciImage = [ciImage imageByApplyingOrientation:1];
    } else if (curOrientation == UIInterfaceOrientationPortrait){
        ciImage = [ciImage imageByApplyingOrientation:6];
    } else if (curOrientation == UIInterfaceOrientationPortraitUpsideDown){
        ciImage = [ciImage imageByApplyingOrientation:8];
    }



    // ....

}


Answer 7:

If the AVCaptureVideoPreviewLayer orientation is correct, you can simply set the orientation before you capture the image.

AVCaptureStillImageOutput *stillImageOutput;
AVCaptureVideoPreviewLayer *previewLayer;
NSData *capturedImageData;

AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
if ([videoConnection isVideoOrientationSupported]) {
    [videoConnection setVideoOrientation:previewLayer.connection.videoOrientation];
}
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    CFDictionaryRef exifAttachments =
            CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exifAttachments) {
        // Do something with the attachments.
    }
    // TODO need to manually add GPS data to the image captured
    capturedImageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [UIImage imageWithData:capturedImageData];
}];

Also, it's important to note that UIImageOrientation and AVCaptureVideoOrientation are different. UIImageOrientationUp refers to landscape mode with the volume controls down toward the ground (not up, as you might expect if you think of the volume controls as a shutter button).

Thus, portrait orientation with the power button pointing to the sky (AVCaptureVideoOrientationPortrait) is actually UIImageOrientationLeft.
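So, if you leave the connection at AVCaptureVideoOrientationPortrait and build the UIImage yourself, the mapping above suggests tagging it like this (a sketch based on this answer's mapping; cgImage stands in for your decoded frame):

// Per the mapping above: a portrait capture corresponds to UIImageOrientationLeft
UIImage *image = [UIImage imageWithCGImage:cgImage
                                     scale:1.0
                               orientation:UIImageOrientationLeft];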



Answer 8:

The orientation issue is with the front camera, so check the device type and generate a new image; it will definitely solve the orientation issue:

- (void)capture:(void (^)(UIImage *))handler {

    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {

        if (imageSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *capturedImage = [UIImage imageWithData:imageData];
            // Front camera: re-tag the image as left-mirrored to correct the orientation
            if (self.captureDevice == [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo][1]) {
                capturedImage = [[UIImage alloc] initWithCGImage:capturedImage.CGImage scale:1.0f orientation:UIImageOrientationLeftMirrored];
            }

            handler(capturedImage);
        }
    }];
}


Answer 9:

First of all, in the configuration of your video output, put these lines:

guard let connection = videoOutput.connection(withMediaType: AVFoundation.AVMediaTypeVideo) else { return }
guard connection.isVideoOrientationSupported else { return }
guard connection.isVideoMirroringSupported else { return }
connection.videoOrientation = .portrait
connection.isVideoMirrored = position == .front

Then, configure your target to support only Portrait by unchecking the Landscape modes in the General configuration.

(Source)



Answer 10:

You can try this:

private func startLiveVideo() {

    // Build the session with the default camera as input and a video-data output
    let captureSession = AVCaptureSession()
    captureSession.sessionPreset = .photo
    let captureDevice = AVCaptureDevice.default(for: .video)

    let input = try! AVCaptureDeviceInput(device: captureDevice!)
    let output = AVCaptureVideoDataOutput()
    captureSession.addInput(input)
    captureSession.addOutput(output)

    output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
    // Force the output's video connection to portrait before buffers are delivered
    output.connection(with: .video)?.videoOrientation = .portrait

    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.bounds
    view.layer.addSublayer(previewLayer)

    captureSession.startRunning()
}