I want to display the streams of the front and the back facing cameras of an iPad 2 in two UIViews next to each other.
To stream the image from one device I use the following code:
AVCaptureDeviceInput *captureInputFront = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session addInput:captureInputFront];
[session setSessionPreset:AVCaptureSessionPresetMedium];
[session startRunning];
AVCaptureVideoPreviewLayer *prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
prevLayer.frame = self.view.frame;
[self.view.layer addSublayer:prevLayer];
which works fine for either camera.
To display the streams in parallel I tried to create a second session, but as soon as the second session is established, the first one freezes.
Then I tried adding two AVCaptureDeviceInputs to the same session, but it seems that at most one input per session is supported at the moment.
Any helpful ideas on how to stream from both cameras?
It is possible to get CMSampleBufferRefs from multiple video devices on Mac OS X. You have to set up the AVCaptureConnection objects manually. For example, assuming you have these objects:
AVCaptureSession *session;
AVCaptureInput *videoInput1;
AVCaptureInput *videoInput2;
AVCaptureVideoDataOutput *videoOutput1;
AVCaptureVideoDataOutput *videoOutput2;
Do NOT add the outputs like this:
[session addOutput:videoOutput1];
[session addOutput:videoOutput2];
Instead, add them and tell the session not to make any connections:
[session addOutputWithNoConnections:videoOutput1];
[session addOutputWithNoConnections:videoOutput2];
Then for each input/output pair make the connection from the input's video port to the output manually:
for (AVCaptureInputPort *port in [videoInput1 ports]) {
    if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
        AVCaptureConnection *cxn = [AVCaptureConnection
            connectionWithInputPorts:[NSArray arrayWithObject:port]
                              output:videoOutput1];
        if ([session canAddConnection:cxn]) {
            [session addConnection:cxn];
        }
        break;
    }
}
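The same loop has to be repeated for the second pair. To avoid duplicating it, you could factor it into a small helper along these lines (connectVideoInput:toOutput:inSession: is just a hypothetical name for illustration, not part of AVFoundation):

// Hypothetical helper: wires one input's video port to one video data output.
- (void)connectVideoInput:(AVCaptureInput *)input
                 toOutput:(AVCaptureVideoDataOutput *)output
                inSession:(AVCaptureSession *)session
{
    for (AVCaptureInputPort *port in [input ports]) {
        if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
            AVCaptureConnection *cxn = [AVCaptureConnection
                connectionWithInputPorts:[NSArray arrayWithObject:port]
                                  output:output];
            if ([session canAddConnection:cxn]) {
                [session addConnection:cxn];
            }
            break;
        }
    }
}

// Connect both input/output pairs:
[self connectVideoInput:videoInput1 toOutput:videoOutput1 inSession:session];
[self connectVideoInput:videoInput2 toOutput:videoOutput2 inSession:session];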
Finally, make sure to set sample buffer delegates for both outputs:
[videoOutput1 setSampleBufferDelegate:self queue:someDispatchQueue];
[videoOutput2 setSampleBufferDelegate:self queue:someDispatchQueue];
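Here someDispatchQueue stands for a queue you create yourself; AVCaptureVideoDataOutput expects a serial queue so that frames are delivered in order. Something like this should do (the label is only an example):

// Serial queue for -captureOutput:didOutputSampleBuffer:fromConnection: callbacks.
dispatch_queue_t someDispatchQueue =
    dispatch_queue_create("com.example.capture", DISPATCH_QUEUE_SERIAL);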
and now you should be able to process frames from both devices:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
{
    if (captureOutput == videoOutput1) {
        // handle frames from the first device
    }
    else if (captureOutput == videoOutput2) {
        // handle frames from the second device
    }
}
See also the AVVideoWall sample project for an example of combining live previews from multiple video devices.
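If you want live previews rather than raw sample buffers (as in the original question), the same no-connection idea applies to AVCaptureVideoPreviewLayer. A rough sketch, assuming videoPort1 and videoPort2 are the AVCaptureInputPorts found with the loop shown earlier:

// Sketch: one preview layer per input, with the connections made manually.
AVCaptureVideoPreviewLayer *previewLayer1 =
    [AVCaptureVideoPreviewLayer layerWithSessionWithNoConnection:session];
AVCaptureVideoPreviewLayer *previewLayer2 =
    [AVCaptureVideoPreviewLayer layerWithSessionWithNoConnection:session];

AVCaptureConnection *previewCxn1 =
    [AVCaptureConnection connectionWithInputPort:videoPort1 videoPreviewLayer:previewLayer1];
AVCaptureConnection *previewCxn2 =
    [AVCaptureConnection connectionWithInputPort:videoPort2 videoPreviewLayer:previewLayer2];

if ([session canAddConnection:previewCxn1]) [session addConnection:previewCxn1];
if ([session canAddConnection:previewCxn2]) [session addConnection:previewCxn2];

// Each layer can then be added as a sublayer of its own view's layer.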