I'm currently creating a simple application which uses AVFoundation to stream video into a UIImageView. To achieve this, I created an instance of AVCaptureSession() and set an AVCaptureSessionPreset:
let input = try AVCaptureDeviceInput(device: device)
print(input)
if captureSession.canAddInput(input) {
    captureSession.addInput(input)
    if captureSession.canAddOutput(sessionOutput) {
        captureSession.addOutput(sessionOutput)
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.portrait
        cameraView.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }
}
cameraView references the UIImageView outlet.
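For context, the snippet above assumes declarations roughly like these (the names match the code; the photo preset is just an example choice):

import UIKit
import AVFoundation

class ViewController: UIViewController {

    @IBOutlet weak var cameraView: UIImageView!

    let captureSession = AVCaptureSession()
    let sessionOutput = AVCapturePhotoOutput()
    var previewLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureSession.sessionPreset = AVCaptureSessionPresetPhoto
        // device lookup and the input/output wiring shown above go here
    }
}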
I now want to implement a way of capturing a still image from the AVCaptureSession.
Correct me if there's a more efficient way, but I plan to have an additional UIImageView to hold the still image, placed on top of the UIImageView which holds the video.
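Concretely, I'm imagining something like this, with captureImageView being the new outlet layered above cameraView in the storyboard:

// Additional outlet for the still image, placed above cameraView in the storyboard
@IBOutlet weak var captureImageView: UIImageView!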
I've created a button with action:
@IBAction func takePhoto(_ sender: Any) {
    // functionality to obtain still image
}
My issue is that I'm unsure how to actually obtain a still image from the capture session and populate the new UIImageView with it.
After looking at information/questions posted on Stack Overflow, the majority of the solutions suggest using captureStillImageAsynchronouslyFromConnection.
I'm unsure if it's just Swift 3.0, but Xcode isn't recognising this function.
Could someone please advise me on how to obtain and display a still image when the button is clicked?
Here is a link to my full code for a better understanding of my program.
Thank you all in advance for taking the time to read my question, and please feel free to tell me if I've missed any relevant details.
If you are targeting iOS 10 or above, captureStillImageAsynchronously(from:completionHandler:) is deprecated, along with AVCaptureStillImageOutput.
As per the documentation:

The AVCaptureStillImageOutput class is deprecated in iOS 10.0 and does not support newer camera capture features such as RAW image output, Live Photos, or wide-gamut color. In iOS 10.0 and later, use the AVCapturePhotoOutput class instead. (The AVCaptureStillImageOutput class remains supported in macOS 10.12.)
As per your code, you are already using AVCapturePhotoOutput, so just follow the steps below to take a photo from the session (a sketch of steps 1 and 2 follows the list). The same steps can be found in the Apple documentation.
- Create an AVCapturePhotoOutput object. Use its properties to determine supported capture settings and to enable certain features (for example, whether to capture Live Photos).
- Create and configure an AVCapturePhotoSettings object to choose features and settings for a specific capture (for example, whether to enable image stabilization or flash).
- Capture an image by passing your photo settings object to the capturePhoto(with:delegate:) method along with a delegate object implementing the AVCapturePhotoCaptureDelegate protocol. The photo capture output then calls your delegate to notify you of significant events during the capture process.
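For reference, here is a minimal sketch of steps 1 and 2, using the property names from the question; the JPEG format, flash mode, and stabilization values are illustrative assumptions, not requirements:

// Step 1: create the photo output (already added to the session in the question's setup code)
let sessionOutput = AVCapturePhotoOutput()

// Step 2: configure the settings for a capture
let sessionOutputSetting = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])
sessionOutputSetting.flashMode = .auto
sessionOutputSetting.isAutoStillImageStabilizationEnabled = true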
You are already doing steps 1 and 2, so add this to your code:
@IBAction func takePhoto(_ sender: Any) {
    print("Taking Photo")
    // AVCapturePhotoSettings objects may not be reused across captures,
    // so pass a fresh copy of the configured settings for every shot
    let settings = AVCapturePhotoSettings(from: sessionOutputSetting)
    sessionOutput.capturePhoto(with: settings, delegate: self)
}
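Passing self directly (instead of force-casting) requires the view controller to adopt the protocol:

class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {
    // ...
}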
Then implement the AVCapturePhotoCaptureDelegate function:

func capture(_ captureOutput: AVCapturePhotoOutput,
             didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
             previewPhotoSampleBuffer: CMSampleBuffer?,
             resolvedSettings: AVCaptureResolvedPhotoSettings,
             bracketSettings: AVCaptureBracketedStillImageSettings?,
             error: Error?)
Note that this delegate gives you a lot of control over taking photos; check out the documentation for more functions. You also need to process the image data, which means converting the sample buffer to a UIImage:
if let sampleBuffer = photoSampleBuffer,
    let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer,
                                                                     previewPhotoSampleBuffer: previewPhotoSampleBuffer),
    let dataProvider = CGDataProvider(data: imageData as CFData),
    let cgImageRef = CGImage(jpegDataProviderSource: dataProvider,
                             decode: nil,
                             shouldInterpolate: true,
                             intent: .defaultIntent) {
    let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: .right)
    // ...
    // Add the image to captureImageView here...
}
Note that the image you get back is rotated left, so we have to manually rotate it right to get a preview-like image.
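Putting it together, a minimal sketch of the complete delegate method; captureImageView is the overlay outlet proposed in the question, and the UI update is dispatched to the main queue since the callback may arrive off the main thread:

func capture(_ captureOutput: AVCapturePhotoOutput,
             didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
             previewPhotoSampleBuffer: CMSampleBuffer?,
             resolvedSettings: AVCaptureResolvedPhotoSettings,
             bracketSettings: AVCaptureBracketedStillImageSettings?,
             error: Error?) {
    if let error = error {
        print("Photo capture failed: \(error)")
        return
    }
    if let sampleBuffer = photoSampleBuffer,
        let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer,
                                                                         previewPhotoSampleBuffer: previewPhotoSampleBuffer),
        let dataProvider = CGDataProvider(data: imageData as CFData),
        let cgImageRef = CGImage(jpegDataProviderSource: dataProvider,
                                 decode: nil,
                                 shouldInterpolate: true,
                                 intent: .defaultIntent) {
        let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: .right)
        // UIKit must be touched on the main thread
        DispatchQueue.main.async {
            self.captureImageView.image = image
        }
    }
}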
More info can be found in my previous SO answer.