I am calling AVFoundation's delegate method to handle a photo capture, but I am having difficulty converting the AVCapturePhoto it generates into a UIImage with the correct orientation. Although the routine below runs successfully, I always get a right-oriented UIImage (UIImage.imageOrientation = 3). There is no way to provide an orientation when using the UIImage(data:) initializer, and attempting to first use photo.cgImageRepresentation()?.takeRetainedValue() doesn't help either. Please assist.
Image orientation is critical here, as the resulting image is fed into a Vision framework workflow.
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    // capture image finished
    print("Image captured.")
    if let imageData = photo.fileDataRepresentation() {
        if let uiImage = UIImage(data: imageData) {
            // do stuff to UIImage
        }
    }
}
UPDATE 1: Reading Apple's Photo Capture Programming Guide (out of date for iOS 11), I did manage to find one thing I was doing wrong:
- On every capture call (self.capturePhotoOutput.capturePhoto) you must get the connection from the photoOutput object and update its orientation to match the device's orientation at the moment the picture is taken. To do that, I created an extension of UIDeviceOrientation and used it in the snapPhoto() function I wrote to call the capture routine and wait for the didFinishProcessingPhoto delegate method to be executed. I've added a snapshot of the code because the code sample delimiters here don't seem to be displaying it correctly (sketched below).
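Since that snapshot was an image, the following is only a minimal sketch of the shape such code typically takes; the videoOrientation helper name, the CameraController class, and the exact landscape mapping are assumptions, not the original code.

import AVFoundation
import UIKit

extension UIDeviceOrientation {
    // Hypothetical helper mapping the device orientation to the equivalent
    // capture-connection orientation (back camera: landscape sides swap).
    var videoOrientation: AVCaptureVideoOrientation {
        switch self {
        case .landscapeLeft: return .landscapeRight
        case .landscapeRight: return .landscapeLeft
        case .portraitUpsideDown: return .portraitUpsideDown
        default: return .portrait // portrait, faceUp, faceDown, unknown
        }
    }
}

final class CameraController: NSObject, AVCapturePhotoCaptureDelegate {
    let capturePhotoOutput = AVCapturePhotoOutput()

    func snapPhoto() {
        // Refresh the connection's orientation right before every capture call,
        // otherwise the photo output keeps whatever orientation it was set up with.
        if let connection = capturePhotoOutput.connection(with: .video) {
            connection.videoOrientation = UIDevice.current.orientation.videoOrientation
        }
        capturePhotoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }
}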
UPDATE 2: Link to the full project on GitHub: https://github.com/agu3rra/Out-Loud
Updated extension provided by Andre, which works with Swift 4.2:
I've had success doing this:
It's based on what Apple mention in their docs:
To create our image with the right orientation, we need to pass the correct UIImage.Orientation when we initialize the image. It's best to use the CGImagePropertyOrientation that comes back in the photoOutput delegate to get the exact orientation the camera session was in when the picture was taken. The only problem here is that while the enum cases of UIImage.Orientation and CGImagePropertyOrientation are the same, the raw values are not. Apple suggests a simple mapping to fix this: https://developer.apple.com/documentation/imageio/cgimagepropertyorientation
Here is my implementation:
AVCapturePhotoCaptureDelegate
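A minimal sketch of a delegate following this approach; the PhotoCaptureProcessor class and onCapture callback are placeholder names, not necessarily the original implementation. It relies on the UIImage.Orientation initializer defined under Extension for Mapping below.

import AVFoundation
import ImageIO
import UIKit

final class PhotoCaptureProcessor: NSObject, AVCapturePhotoCaptureDelegate {
    var onCapture: ((UIImage) -> Void)?

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              // On older SDKs cgImageRepresentation() returns Unmanaged<CGImage>?,
              // in which case unwrap it with takeUnretainedValue() first.
              let cgImage = photo.cgImageRepresentation(),
              let rawOrientation = photo.metadata[kCGImagePropertyOrientation as String] as? UInt32,
              let cgOrientation = CGImagePropertyOrientation(rawValue: rawOrientation) else { return }

        // Translate the CGImage orientation into UIImage.Orientation before building the image.
        let image = UIImage(cgImage: cgImage,
                            scale: 1.0,
                            orientation: UIImage.Orientation(cgOrientation))
        onCapture?(image)
    }
}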
Extension for Mapping
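A sketch of the mapping extension along the lines Apple suggests in the linked CGImagePropertyOrientation documentation (the case names match, the raw values do not):

import ImageIO
import UIKit

extension UIImage.Orientation {
    // Translate a CGImagePropertyOrientation (EXIF-style raw values)
    // into the corresponding UIImage.Orientation case.
    init(_ cgOrientation: CGImagePropertyOrientation) {
        switch cgOrientation {
        case .up: self = .up
        case .upMirrored: self = .upMirrored
        case .down: self = .down
        case .downMirrored: self = .downMirrored
        case .left: self = .left
        case .leftMirrored: self = .leftMirrored
        case .right: self = .right
        case .rightMirrored: self = .rightMirrored
        @unknown default: self = .up
        }
    }
}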
Final update: I ran some experiments with the app and came to the following conclusions:
- kCGImagePropertyOrientation doesn't seem to influence the orientation of the captured image inside your application; it only varies with the device orientation if you update your photoOutput connection each time you are about to call the capturePhoto method.
- Viewing the generated images in the debugger showed me how they get generated, so I could infer the required rotation (UIImageOrientation) to get each one displayed upright. In other words, the UIImageOrientation you set tells how the image should be rotated in order for you to see it in the correct orientation. From that I came to a table mapping each device orientation to the UIImageOrientation needed to display the capture upright, and I had to update my UIDeviceOrientation extension to a rather unintuitive form.
This is how my final delegate method looks now; it displays the image in the expected orientation (sketched below together with the extension).
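Because those snapshots were also images, the following is only a rough sketch of the shape of that extension and delegate method. The case-by-case mapping and the deviceOrientationOnCapture property are illustrative assumptions; the actual values were determined empirically and should be verified on a device.

import AVFoundation
import UIKit

extension UIDeviceOrientation {
    // Placeholder mapping: the real, empirically determined values may differ.
    func imageOrientation() -> UIImage.Orientation {
        switch self {
        case .portrait, .faceUp: return .right
        case .portraitUpsideDown, .faceDown: return .left
        case .landscapeLeft: return .up
        case .landscapeRight: return .down
        default: return .right
        }
    }
}

final class CameraController: NSObject, AVCapturePhotoCaptureDelegate {
    // Hypothetical property, recorded in snapPhoto() at the moment of capture.
    var deviceOrientationOnCapture: UIDeviceOrientation = .portrait

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let imageData = photo.fileDataRepresentation(),
              let cgImage = UIImage(data: imageData)?.cgImage else { return }

        // Rebuild the UIImage, overriding its orientation with the one inferred
        // from how the device was held when the capture was triggered.
        let image = UIImage(cgImage: cgImage,
                            scale: 1.0,
                            orientation: deviceOrientationOnCapture.imageOrientation())
        // ... hand `image` to the Vision workflow ...
        _ = image
    }
}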
Inside the AVCapturePhoto, I'm pretty sure you will find a metadata object, also known as CGImageProperties. Inside it you will find the EXIF dictionary for orientation; the next step is just to take the orientation and create an image according to it.
I do not have experience using AVCapturePhotoOutput, but I have some using the old way. Pay attention that the EXIF dictionary is mapped differently from UIImageOrientation.
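A sketch of reading the orientation the "old way" via ImageIO (an assumption of what this refers to, not the answerer's code): pull kCGImagePropertyOrientation out of the image data's properties, keeping in mind its raw values follow EXIF numbering rather than UIImageOrientation's.

import ImageIO
import UIKit

func exifOrientation(from imageData: Data) -> CGImagePropertyOrientation? {
    // Read the image properties without decoding the full image.
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let rawValue = properties[kCGImagePropertyOrientation] as? UInt32 else {
        return nil
    }
    // EXIF-numbered orientation (1...8); map it to UIImage.Orientation separately.
    return CGImagePropertyOrientation(rawValue: rawValue)
}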
Here is an article I wrote a long time ago, but the main principles are still valid.
This question will point you to some implementations. It's pretty old too; I'm pretty sure the latest versions offer an easier API, but it will still guide you in tackling the issue.