I have the following code, which works on iOS 6 and 7.x.
On iOS 8.1 I have a strange issue: if you capture a session for about 13 seconds or longer, the resulting AVAsset has only one track (video); the audio track is simply not there.
If you record for a shorter period, the AVAsset has two tracks (video and audio) as expected. I have plenty of disk space, and the app has permission to use the camera and microphone.
I created a new project with minimal code, and it reproduces the issue.
Any ideas would be greatly appreciated.
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController () <AVCaptureFileOutputRecordingDelegate>
@end
@implementation ViewController
{
    enum RecordingState { Recording, Stopped };
    enum RecordingState recordingState;

    AVCaptureSession *session;
    AVCaptureMovieFileOutput *output;
    AVPlayer *player;
    AVPlayerLayer *playerLayer;
    bool audioGranted;
}
- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupAV];
    recordingState = Stopped;
}
- (void)setupAV
{
    session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];

    AVCaptureDevice *videoDevice = nil;
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == AVCaptureDevicePositionBack) {
            videoDevice = device;
            break;
        }
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

    if (videoDevice && audioDevice)
    {
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
        [session addInput:input];

        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
        [session addInput:audioInput];

        NSURL *recordURL = [self tempUrlForRecording];
        [[NSFileManager defaultManager] removeItemAtURL:recordURL error:nil];

        output = [[AVCaptureMovieFileOutput alloc] init];
        output.maxRecordedDuration = CMTimeMake(45, 1);
        output.maxRecordedFileSize = 1028 * 1028 * 1000;
        [session addOutput:output];
    }
    [session commitConfiguration];
}
- (IBAction)recordingButtonClicked:(id)sender {
    if (recordingState == Stopped)
    {
        [self startRecording];
    }
    else
    {
        [self stopRecording];
    }
}
- (void)startRecording
{
    recordingState = Recording;
    [session startRunning];
    [output startRecordingToOutputFileURL:[self tempUrlForRecording] recordingDelegate:self];
}

- (void)stopRecording
{
    recordingState = Stopped;
    [output stopRecording];
    [session stopRunning];
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    AVAsset *cameraInput = [AVAsset assetWithURL:[self tempUrlForRecording]];
    // DEPENDING ON HOW LONG RECORDED THIS DIFFERS (<14 SECS - 2 Tracks, >14 SECS - 1 Track)
    NSLog(@"Number of tracks: %lu", (unsigned long)cameraInput.tracks.count);
}
- (NSURL *)tempUrlForRecording
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [paths objectAtIndex:0];
    NSString *path = @"camerabuffer.mp4";
    NSString *pathCameraInput = [documentsDirectoryPath stringByAppendingPathComponent:path];
    return [NSURL fileURLWithPath:pathCameraInput];
}
- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}
@end
I had this issue, and the way I fixed it in Swift 4 is the following:
Do not set movieFileOutput.maxRecordedDuration. There seems to be a bug where, if you set it, recordings longer than about 12-13 seconds end up with no audio.
Instead, use a timer to stop the recording yourself, and set movieFragmentInterval like this:
movieOutput.movieFragmentInterval = kCMTimeInvalid
That fixed it for me.
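A minimal sketch of the timer-based approach might look like this (not my exact code; the class name, the 45-second limit, and property names such as stopTimer are illustrative):

```swift
import AVFoundation
import UIKit

class RecorderViewController: UIViewController, AVCaptureFileOutputRecordingDelegate {

    let session = AVCaptureSession()
    let movieOutput = AVCaptureMovieFileOutput()
    var stopTimer: Timer?

    func configureOutput() {
        // Do NOT set movieOutput.maxRecordedDuration here.
        // Writing the movie without fragments works around the lost-audio bug.
        movieOutput.movieFragmentInterval = kCMTimeInvalid
        if session.canAddOutput(movieOutput) {
            session.addOutput(movieOutput)
        }
    }

    func startRecording(to url: URL) {
        movieOutput.startRecording(to: url, recordingDelegate: self)
        // Enforce the maximum duration with a timer instead of maxRecordedDuration.
        stopTimer = Timer.scheduledTimer(withTimeInterval: 45, repeats: false) { [weak self] _ in
            self?.movieOutput.stopRecording()
        }
    }

    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection], error: Error?) {
        stopTimer?.invalidate()
    }
}
```

Note that Timer.scheduledTimer(withTimeInterval:repeats:block:) requires iOS 10 or later.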
I think this is a bug. The documentation says the sample table is not written if the recording does not complete successfully, which implies it should be written automatically when the recording does complete successfully. But now it seems it isn't.
Any ideas?