What is the screen device called in iOS/Swift?
When I print the devices, I get:
(
"<AVCaptureFigVideoDevice: 0x134d0f210 [Back Camera][com.apple.avfoundation.avcapturedevice.built-in_video:0]>",
"<AVCaptureFigVideoDevice: 0x134e0af80 [Front Camera][com.apple.avfoundation.avcapturedevice.built-in_video:1]>",
"<AVCaptureFigAudioDevice: 0x174265440 [iPad Microphone][com.apple.avfoundation.avcapturedevice.built-in_audio:0]>"
)
So where's the screen ID?
There's just too much outdated Objective-C code out there, and Swift is a moving target. I'm looking for a Swift solution to capture video from my iPad screen and audio from the built-in microphone. The audio will be a separate question.
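For reference, the list above is simply what device enumeration returns; a minimal sketch (Swift, using the classic AVCaptureDevice.devices() API) prints the same cameras and microphones, and no screen entry:

import AVFoundation

// Enumerate every capture device AVFoundation exposes. On iOS this only ever
// contains cameras and microphones; the display never shows up as a device.
let devices = AVCaptureDevice.devices() as? [AVCaptureDevice] ?? []
for device in devices {
    print("\(device.localizedName) - \(device.uniqueID)")
}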
Here is a screen grabber for OS X
https://github.com/kennyledet/SwiftCap
// AVCaptureSession holds inputs and outputs for real-time capture
let mSession = AVCaptureSession()
let mScreenCapOutput = AVCaptureMovieFileOutput()
var mOutputPath = ""
// Just capture main display for now
let mMainDisplayId = CGMainDisplayID()
but I cannot find a display ID (an equivalent of CGMainDisplayID) for an iPad anywhere in the documentation...
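For comparison, on OS X the screen isn't a capture device at all; it is wrapped by the dedicated AVCaptureScreenInput class, which takes that display ID. Here is a minimal sketch of the SwiftCap-style wiring (OS X only; AVCaptureScreenInput does not exist in the iOS SDK, which is exactly why no screen entry appears in the device list above):

import AVFoundation
import CoreGraphics

// OS X only: the main display is wrapped in an AVCaptureScreenInput rather than
// appearing in the device list.
let session = AVCaptureSession()
let screenInput = AVCaptureScreenInput(displayID: CGMainDisplayID())
if session.canAddInput(screenInput) {
    session.addInput(screenInput)
}

// Record the captured screen straight to a movie file.
let movieOutput = AVCaptureMovieFileOutput()
if session.canAddOutput(movieOutput) {
    session.addOutput(movieOutput)
}
session.startRunning()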
Here is a typical Swift solution for the camera:
https://github.com/bradley/iOSSwiftSimpleAVCamera
but it has too many errors, doesn't compile against iOS 8.1 or 8.2, and grabs video from the camera rather than the screen.
func addVideoOutput() {
    // kCIFormatRGBA8 in the original repo is a Core Image constant; capture
    // outputs expect a Core Video pixel format such as kCVPixelFormatType_32BGRA.
    let rgbOutputSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    self.videoDeviceOutput = AVCaptureVideoDataOutput()
    self.videoDeviceOutput.videoSettings = rgbOutputSettings   // the repo built this dictionary but never applied it
    self.videoDeviceOutput.alwaysDiscardsLateVideoFrames = true
    self.videoDeviceOutput.setSampleBufferDelegate(self, queue: self.sessionQueue)
    if self.session.canAddOutput(self.videoDeviceOutput) {
        self.session.addOutput(self.videoDeviceOutput)
    }
}
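For completeness, frames from that output arrive in the sample-buffer delegate. Here is a minimal sketch of the receiving side (Swift 3 spelling; later SDKs rename the method to captureOutput(_:didOutput:from:)); FrameReceiver is an illustrative stand-in for the repo's CameraSessionController, which is the object actually passed to setSampleBufferDelegate:

import AVFoundation
import CoreMedia

// Illustrative delegate class; in the repo, CameraSessionController plays this role.
final class FrameReceiver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    // Swift 3 spelling; iOS 11-era SDKs rename this to captureOutput(_:didOutput:from:).
    func captureOutput(_ captureOutput: AVCaptureOutput,
                       didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Runs on the queue handed to setSampleBufferDelegate, once per frame.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Process or encode pixelBuffer here, e.g. hand it to an AVAssetWriter.
        _ = pixelBuffer
    }
}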
Apple gives an Objective-C solution like this:
/*
* Create video connection
*/
AVCaptureDeviceInput *videoIn = [[AVCaptureDeviceInput alloc] initWithDevice:[self videoDeviceWithPosition:AVCaptureDevicePositionBack] error:nil];
if ([_captureSession canAddInput:videoIn])
    [_captureSession addInput:videoIn];

AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
[videoOut setAlwaysDiscardsLateVideoFrames:YES];
[videoOut setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]}];
dispatch_queue_t videoCaptureQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
[videoOut setSampleBufferDelegate:self queue:videoCaptureQueue];
if ([_captureSession canAddOutput:videoOut])
    [_captureSession addOutput:videoOut];

_videoConnection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
self.videoOrientation = _videoConnection.videoOrientation;

if ([self.session canSetSessionPreset:AVCaptureSessionPreset640x480])
    [self.session setSessionPreset:AVCaptureSessionPreset640x480]; // Lower video resolution to decrease recorded movie size

return YES;
}
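That Objective-C snippet translates roughly to the following Swift 3. This is only a sketch, not Apple's code: CaptureController and its property names are illustrative, and it uses the default video device instead of the positional helper:

import AVFoundation

// Illustrative container for the session, queue, and connection used below.
final class CaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let captureSession = AVCaptureSession()
    let videoCaptureQueue = DispatchQueue(label: "Video Capture Queue")
    var videoConnection: AVCaptureConnection?

    func setupVideoCapture() -> Bool {
        // Camera in (the default video device, typically the back camera).
        guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
              let videoIn = try? AVCaptureDeviceInput(device: camera),
              captureSession.canAddInput(videoIn) else { return false }
        captureSession.addInput(videoIn)

        // BGRA frames out, delivered to the delegate on a serial queue.
        let videoOut = AVCaptureVideoDataOutput()
        videoOut.alwaysDiscardsLateVideoFrames = true
        videoOut.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
        videoOut.setSampleBufferDelegate(self, queue: videoCaptureQueue)
        guard captureSession.canAddOutput(videoOut) else { return false }
        captureSession.addOutput(videoOut)

        videoConnection = videoOut.connection(withMediaType: AVMediaTypeVideo)

        // Lower video resolution to decrease recorded movie size.
        if captureSession.canSetSessionPreset(AVCaptureSessionPreset640x480) {
            captureSession.sessionPreset = AVCaptureSessionPreset640x480
        }
        return true
    }
}

The structure mirrors the Objective-C: input first, then the data output with a serial delegate queue, then the session preset.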
This should be easy.....???
If you want to capture the screen's video and save it, another option is to take a series of screenshots and later convert the array of images into a video. It's not very efficient from a performance standpoint, so you probably won't get 30-60 fps, but if you are OK with 5-20 fps you might want to take a look at this example for Swift 3.
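Here is a minimal sketch of that screenshot approach (Swift 3; iOS 10+ for the block-based Timer; ScreenGrabber and the default fps are illustrative). Note that snapshotting the key window only captures your own app's UI, not the whole device screen:

import UIKit

// Snapshot the key window on a timer and collect the images for later encoding.
final class ScreenGrabber {
    private var frames: [UIImage] = []
    private var timer: Timer?

    // Call from the main thread; frame rate is limited by how fast the view
    // hierarchy can be redrawn, hence the 5-20 fps range mentioned above.
    func start(fps: Double = 10) {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / fps, repeats: true) { [weak self] _ in
            guard let window = UIApplication.shared.keyWindow else { return }
            UIGraphicsBeginImageContextWithOptions(window.bounds.size, false, 0)
            window.drawHierarchy(in: window.bounds, afterScreenUpdates: false)
            if let image = UIGraphicsGetImageFromCurrentImageContext() {
                self?.frames.append(image)
            }
            UIGraphicsEndImageContext()
        }
    }

    func stop() -> [UIImage] {
        timer?.invalidate()
        return frames   // hand these to an AVAssetWriter-based encoder
    }
}

The collected images can then be pushed into a movie file with AVAssetWriter and an AVAssetWriterInputPixelBufferAdaptor, which is the usual way to turn an image sequence into a video.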
Here is a working copy of iOSSwiftSimpleAVCamera in Swift. It doesn't quite solve your problem, but it is a bit of a starting point for anyone else who finds this thread. Some of the error checking was removed from this code, so be wary; it will only work on an actual device, not in the simulator.
App delegate
CameraSessionController
Camera view controller