Camera image orientation

Published 2019-01-18 06:58

Question:

In my app I'm capturing images using the Camera. These are being stored in an NSArray as an NSData representation. When I convert the NSData back to the image, the orientation is now landscape instead of portrait as I took it.

    NSData *data = UIImagePNGRepresentation([arrayImage objectAtIndex:0]);
    UIImage *tmp = [UIImage imageWithData:data];

Anyone have an explanation? Thanks.

Answer 1:

By default, the orientation of a camera-captured image is stored only as metadata rather than applied to the pixel data, so you need to fix the orientation yourself. The following code does that:

- (UIImage *)fixrotation:(UIImage *)image
{
    if (image.imageOrientation == UIImageOrientationUp) return image;
    CGAffineTransform transform = CGAffineTransformIdentity;

    switch (image.imageOrientation) {
        case UIImageOrientationDown:
        case UIImageOrientationDownMirrored:
            transform = CGAffineTransformTranslate(transform, image.size.width, image.size.height);
            transform = CGAffineTransformRotate(transform, M_PI);
            break;

        case UIImageOrientationLeft:
        case UIImageOrientationLeftMirrored:
            transform = CGAffineTransformTranslate(transform, image.size.width, 0);
            transform = CGAffineTransformRotate(transform, M_PI_2);
            break;

        case UIImageOrientationRight:
        case UIImageOrientationRightMirrored:
            transform = CGAffineTransformTranslate(transform, 0, image.size.height);
            transform = CGAffineTransformRotate(transform, -M_PI_2);
            break;
        case UIImageOrientationUp:
        case UIImageOrientationUpMirrored:
            break;
    }

    switch (image.imageOrientation) {
        case UIImageOrientationUpMirrored:
        case UIImageOrientationDownMirrored:
            transform = CGAffineTransformTranslate(transform, image.size.width, 0);
            transform = CGAffineTransformScale(transform, -1, 1);
            break;

        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRightMirrored:
            transform = CGAffineTransformTranslate(transform, image.size.height, 0);
            transform = CGAffineTransformScale(transform, -1, 1);
            break;
        case UIImageOrientationUp:
        case UIImageOrientationDown:
        case UIImageOrientationLeft:
        case UIImageOrientationRight:
            break;
    }

    // Now we draw the underlying CGImage into a new context, applying the transform
    // calculated above.
    CGContextRef ctx = CGBitmapContextCreate(NULL, image.size.width, image.size.height,
                                             CGImageGetBitsPerComponent(image.CGImage), 0,
                                             CGImageGetColorSpace(image.CGImage),
                                             CGImageGetBitmapInfo(image.CGImage));
    CGContextConcatCTM(ctx, transform);
    switch (image.imageOrientation) {
        case UIImageOrientationLeft:
        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRight:
        case UIImageOrientationRightMirrored:
            // The context dimensions are swapped for rotated orientations.
            CGContextDrawImage(ctx, CGRectMake(0,0,image.size.height,image.size.width), image.CGImage);
            break;

        default:
            CGContextDrawImage(ctx, CGRectMake(0,0,image.size.width,image.size.height), image.CGImage);
            break;
    }

    // And now we just create a new UIImage from the drawing context
    CGImageRef cgimg = CGBitmapContextCreateImage(ctx);
    UIImage *img = [UIImage imageWithCGImage:cgimg];
    CGContextRelease(ctx);
    CGImageRelease(cgimg);
    return img;
}


Answer 2:

I ran into this exact problem and switched over to UIImageJPEGRepresentation, which fixed it. UIImagePNGRepresentation discards the orientation metadata (PNG has no standard EXIF block), whereas JPEG data keeps the orientation tag.
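As a minimal sketch of that change, assuming arrayImage holds the camera's UIImage objects as in the question (the 0.9 compression quality is an arbitrary choice):

```objc
// JPEG serialization preserves the EXIF orientation tag; PNG does not.
NSData *data = UIImageJPEGRepresentation([arrayImage objectAtIndex:0], 0.9);
UIImage *tmp = [UIImage imageWithData:data];
// tmp.imageOrientation should now match the captured image's orientation.
```

Note that JPEG is lossy, so this trade-off only makes sense when exact pixel data is not required.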



Answer 3:

-[UIImage imageOrientation] might help :)

Image orientation affects the way the image data is displayed when drawn. By default, images are displayed in the “up” orientation. If the image has associated metadata (such as EXIF information), however, this property contains the orientation indicated by that metadata. For a list of possible values for this property, see “UIImageOrientation.”

Since that property is read-only, and depending on what you want to do, a possible (but ugly) solution could be:

UIImage *sourceImage = [arrayImage objectAtIndex:0];
NSData *data = UIImagePNGRepresentation(sourceImage);
UIImage *tmp = [UIImage imageWithData:data];
UIImage *fixed = [UIImage imageWithCGImage:tmp.CGImage
                                     scale:sourceImage.scale
                               orientation:sourceImage.imageOrientation];

(untested, and there may well be something cleaner)

EDIT: The first part answered your question with an explanation rather than a fix.

This and this (old?) blog posts might be interesting reading for you. Strangely, I've never hit this issue, and I use UIImageJPEGRepresentation to send images to a server. Which iOS version are you working on? Could this be an old SDK bug?



Answer 4:

//iOS Swift 3

func fixRotation(image: UIImage) -> UIImage
    {
        if (image.imageOrientation == .up)
        {
            return image
        }

        var transform: CGAffineTransform = .identity

        switch image.imageOrientation {
        case .down, .downMirrored:
            transform = transform.translatedBy(x: image.size.width, y: image.size.height)
            transform = transform.rotated(by: .pi)
        case .left, .leftMirrored:
            transform = transform.translatedBy(x: image.size.width, y: 0)
            transform = transform.rotated(by: .pi / 2)
        case .right, .rightMirrored:
            transform = transform.translatedBy(x: 0, y: image.size.height)
            transform = transform.rotated(by: -(.pi / 2))
        case .up, .upMirrored:
            break
        }

        switch image.imageOrientation {
        case .upMirrored, .downMirrored:
            transform = transform.translatedBy(x: image.size.width, y: 0)
            transform = transform.scaledBy(x: -1, y: 1)
        case .leftMirrored, .rightMirrored:
            transform = transform.translatedBy(x: image.size.height, y: 0)
            transform = transform.scaledBy(x: -1, y: 1)
        case .up, .down, .left, .right:
            break
        }

        // Draw the underlying CGImage into a new context, applying the
        // transform calculated above. Bail out with the original image
        // rather than force-unwrapping and crashing if anything is nil.
        guard let cgimage = image.cgImage,
              let colorSpace = cgimage.colorSpace,
              let ctx = CGContext(data: nil,
                                  width: Int(image.size.width),
                                  height: Int(image.size.height),
                                  bitsPerComponent: cgimage.bitsPerComponent,
                                  bytesPerRow: 0,
                                  space: colorSpace,
                                  bitmapInfo: cgimage.bitmapInfo.rawValue)
        else { return image }

        ctx.concatenate(transform)

        switch image.imageOrientation {
        case .left, .leftMirrored, .right, .rightMirrored:
            // The context dimensions are swapped for rotated orientations.
            ctx.draw(cgimage, in: CGRect(x: 0, y: 0, width: image.size.height, height: image.size.width))
        default:
            ctx.draw(cgimage, in: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
        }

        guard let cgimg = ctx.makeImage() else { return image }
        return UIImage(cgImage: cgimg)
    }


Answer 5:

An alternative is to avoid the problem at capture time: set the video orientation on the still-image connection before capturing, as in Apple's AVCam sample:

dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // Flash set to Auto for Still Capture
        [AVCamViewController setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];
            }
        }];
    });

https://developer.apple.com/library/content/samplecode/AVCam/Introduction/Intro.html



Answer 6:

Here is a slightly modified Xamarin.iOS (C#) version of the top answer, so that you don't have to rewrite it by hand ;-).

public static UIImage FixCameraImageRotation(UIImage image)
{
    if (image.Orientation == UIImageOrientation.Up) return image;

    var transform = CGAffineTransform.MakeIdentity();

    //handle orientation
    switch (image.Orientation)
    {
        case UIImageOrientation.Down:
        case UIImageOrientation.DownMirrored:
            transform.Rotate((float)Math.PI);
            transform.Translate(image.Size.Width, image.Size.Height);
            break;
        case UIImageOrientation.Left:
        case UIImageOrientation.LeftMirrored:
            transform.Rotate((float)Math.PI / 2);
            transform.Translate(image.Size.Width, 0);
            break;
        case UIImageOrientation.Right:
        case UIImageOrientation.RightMirrored:
            transform.Rotate(-(float)Math.PI / 2);
            transform.Translate(0, image.Size.Height);
            break;
    }

    //handle mirroring
    switch (image.Orientation)
    {
        case UIImageOrientation.UpMirrored:
        case UIImageOrientation.DownMirrored:
            transform.Translate(image.Size.Width, 0);
            transform.Scale(-1, 1);
            break;
        case UIImageOrientation.LeftMirrored:
        case UIImageOrientation.RightMirrored:
            transform.Translate(image.Size.Height, 0);
            transform.Scale(-1, 1);
            break;
    }

    //create context and apply transformation
    using (var context = new CGBitmapContext(
        IntPtr.Zero,
        (nint)image.Size.Width,
        (nint)image.Size.Height,
        image.CGImage.BitsPerComponent,
        image.CGImage.BytesPerRow,
        image.CGImage.ColorSpace,
        image.CGImage.BitmapInfo))
    {
        context.ConcatCTM(transform);

        //draw image
        switch (image.Orientation)
        {
            case UIImageOrientation.Left:
            case UIImageOrientation.LeftMirrored:
            case UIImageOrientation.Right:
            case UIImageOrientation.RightMirrored:
                context.DrawImage(new CGRect(0, 0, image.Size.Height, image.Size.Width), image.CGImage);
                break;
            default:
                context.DrawImage(new CGRect(0, 0, image.Size.Width, image.Size.Height), image.CGImage);
                break;
        }

        //convert result to UIImage
        using (var cgImage = context.ToImage())
        {
            var result = UIImage.FromImage(cgImage);
            return result;
        }
    }
}