Camera with a full-screen live preview:

previewLayer!.videoGravity = AVLayerVideoGravityResize

Make an image:

stillImageOutput?.captureStillImageAsynchronously(
    from: videoConnection, completionHandler:

The full-screen live preview should precisely match the resulting still image.

(For clarity: say you accidentally use AVLayerVideoGravityResizeAspectFill. In that case the live preview will NOT match the still image - you'll see a "jump" as it is stretched.)

However...

If you try the code below (using AVLayerVideoGravityResize - the correct choice) with iOS 10, it does not precisely work: you get A SMALL JUMP between the live preview and the still image. One or the other is slightly stretched incorrectly.

Could this actually be a bug with some devices, or in iOS? (It works perfectly - no jump - on old devices, and if you try it with iOS 9.)

Has anyone else seen this?
// CameraPlane ... the actual live camera plane per se
import UIKit
import AVFoundation
class CameraPlane:UIViewController
{
var captureSession: AVCaptureSession?
var stillImageOutput: AVCaptureStillImageOutput?
var previewLayer: AVCaptureVideoPreviewLayer?
fileprivate func fixConnectionOrientation()
{
if let connection = self.previewLayer?.connection
{
let previewLayerConnection : AVCaptureConnection = connection
guard previewLayerConnection.isVideoOrientationSupported else
{
print("strangely no orientation support")
return
}
previewLayerConnection.videoOrientation = neededVideoOrientation()
previewLayer!.frame = view.bounds
}
}
func neededVideoOrientation() -> AVCaptureVideoOrientation
{
    // Swift switch cases do not fall through, so no `break` is needed.
    // landscapeLeft/landscapeRight are deliberately swapped: the device
    // orientation and video orientation enums are mirrored for landscape.
    switch UIDevice.current.orientation
    {
    case .portrait: return .portrait
    case .landscapeRight: return .landscapeLeft
    case .landscapeLeft: return .landscapeRight
    case .portraitUpsideDown: return .portraitUpsideDown
    default: return .portrait
    }
}
override func viewDidLayoutSubviews()
{
super.viewDidLayoutSubviews()
fixConnectionOrientation()
}
func cameraBegin()
{
captureSession = AVCaptureSession()
captureSession!.sessionPreset = AVCaptureSessionPresetPhoto
// remember that of course, none of this will work on a simulator, only on a device
let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
let input: AVCaptureDeviceInput
do
{
    input = try AVCaptureDeviceInput(device: backCamera)
}
catch
{
    print("probably on simulator? no camera? \(error)")
    return
}
if ( captureSession!.canAddInput(input) == false )
{
print("capture session problem?")
return;
}
captureSession!.addInput(input)
stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput!.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
if ( captureSession!.canAddOutput(stillImageOutput) == false )
{
print("capture session with stillImageOutput problem?")
return;
}
captureSession!.addOutput(stillImageOutput)
previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
// previewLayer!.videoGravity = AVLayerVideoGravityResizeAspect
// means, won't reach the top and bottom on devices, gray bars
// previewLayer!.videoGravity = AVLayerVideoGravityResizeAspectFill
// means, you get the "large squeeze" once you make photo
previewLayer!.videoGravity = AVLayerVideoGravityResize
// works perfectly on ios9, older devices etc.
// on 6s+, you get a small jump between the video live preview and the make photo
fixConnectionOrientation()
view.layer.addSublayer(previewLayer!)
captureSession!.startRunning()
previewLayer!.frame = view.bounds
}
/*Video Gravity.
These string constants define how the video is displayed within a layer’s bounds rectangle.
You use these constants when setting the videoGravity property of an AVPlayerLayer or AVCaptureVideoPreviewLayer instance.
AVLayerVideoGravityResize
Specifies that the video should be stretched to fill the layer’s bounds.
AVLayerVideoGravityResizeAspect
Specifies that the player should preserve the video’s aspect ratio and fit the video within the layer’s bounds.
AVLayerVideoGravityResizeAspectFill
Specifies that the player should preserve the video’s aspect ratio and fill the layer’s bounds.
*/
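To see why AVLayerVideoGravityResize distorts the picture while AVLayerVideoGravityResizeAspect leaves bars, here is a minimal sketch of the geometry (the sizes are made-up examples, not from the question); AVMakeRect computes the same aspect-fit rectangle the layer uses:

```swift
import AVFoundation
import CoreGraphics

// Made-up example sizes: a portrait 3:4 camera feed shown in a
// portrait 9:16 full-screen layer (the iPhone screen ratio).
let videoSize = CGSize(width: 3, height: 4)
let layerBounds = CGRect(x: 0, y: 0, width: 9, height: 16)

// AVLayerVideoGravityResizeAspect shows the video in the aspect-fit rect:
let fitted = AVMakeRect(aspectRatio: videoSize, insideRect: layerBounds)
// fitted is (0, 2, 9, 12): full width, with 2-unit grey bars top and bottom.

// AVLayerVideoGravityResize instead draws into layerBounds itself,
// stretching the 3:4 feed to 9:16 and distorting it.
```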
func makePhotoOn(_ here:UIImageView)
{
// recall that this indeed makes a still image, which is used as
// a new background image (indeed on the "stillImage" view)
// and you can then continue to move the door around on that scene.
if ( stillImageOutput == nil )
{
print("simulator, using test image.")
here.image = UIImage(named:"ProductMouldings.jpg")
return
}
guard let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo)
else
{
print("AVMediaTypeVideo didn't work?")
return
}
videoConnection.videoOrientation = (previewLayer!.connection?.videoOrientation)!
stillImageOutput?.captureStillImageAsynchronously(
from: videoConnection, completionHandler:
{
(sampleBuffer, error) in
// guard the whole decode chain rather than force-unwrapping each step,
// so a nil buffer or failed JPEG decode can't crash
guard let sampleBuffer = sampleBuffer,
      let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer),
      let dataProvider = CGDataProvider(data: imageData as CFData),
      let cgImageRef = CGImage(jpegDataProviderSource: dataProvider,
                               decode: nil, shouldInterpolate: true,
                               intent: .defaultIntent)
else
{
    print("sample buffer woe?")
    return
}
let ort = self.neededImageOrientation()
here.image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: ort)
})
}
func neededImageOrientation()->(UIImageOrientation)
{
var n : UIImageOrientation
let currentDevice: UIDevice = UIDevice.current
let orientation: UIDeviceOrientation = currentDevice.orientation
switch orientation
{
case UIDeviceOrientation.portraitUpsideDown:
n = .left
case UIDeviceOrientation.landscapeRight:
n = .down
case UIDeviceOrientation.landscapeLeft:
n = .up
case UIDeviceOrientation.portrait:
n = .right
default:
n = .right
}
return n
}
/*
@IBAction func didPressTakeAnother(sender: AnyObject)
{ captureSession!.startRunning() }
*/
}
Sanity check - are you sure AVLayerVideoGravityResize is the one you want to use? That's going to stretch the image (not preserving the aspect ratio) to the frame of the preview. If you intend to maintain the aspect ratio, you want either AVLayerVideoGravityResizeAspect (as you observed, there will be grey bars, but the aspect ratio will be maintained) or AVLayerVideoGravityResizeAspectFill (probably what you want - part of the preview will be cut off, but the aspect ratio will be maintained).

Assuming your 'here' view (the one passed to makePhotoOn(_:)) has the same size and position as your preview view, you'll want to set the 'here' view's contentMode to match the behavior of your preview.

So if you used AVLayerVideoGravityResizeAspect for the preview, then:

here.contentMode = .scaleAspectFit

If you used AVLayerVideoGravityResizeAspectFill for the preview, then:

here.contentMode = .scaleAspectFill

The default contentMode of a view is .scaleToFill (noted here: https://developer.apple.com/reference/uikit/uiview/1622619-contentmode), so your 'here' imageView is probably stretching the image to match its size, not maintaining the aspect ratio.

If that doesn't help, you might consider providing a barebones project that exhibits the problem on GitHub, so that tinkerers among us on SO can quickly build and tinker with it.
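As a concrete sketch of the above (the helper name is my own, not from the question; it assumes the image view exactly overlays the preview layer, and uses the Swift 3 API names from the question):

```swift
import UIKit
import AVFoundation

// Hypothetical helper: make a UIImageView display its still image with
// the same geometry the preview layer used for the live feed.
func matchDisplay(of imageView: UIImageView,
                  to previewLayer: AVCaptureVideoPreviewLayer)
{
    switch previewLayer.videoGravity
    {
    case AVLayerVideoGravityResizeAspect:     imageView.contentMode = .scaleAspectFit
    case AVLayerVideoGravityResizeAspectFill: imageView.contentMode = .scaleAspectFill
    default:                                  imageView.contentMode = .scaleToFill
    }
    imageView.clipsToBounds = true  // crop overflow, as the preview layer does
}
```

Calling this on 'here' before setting here.image in makePhotoOn(_:) would keep the captured photo framed the same way as the live preview, whichever gravity you settle on.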