
AVAudioRecorder records only the audio after interruption


Question:

In my application for recording and playing audio using AVAudioRecorder and AVAudioPlayer, I came across a problem with incoming phone calls. If a phone call arrives while a recording is in progress, only the audio captured after the call ends up in the file. I want the audio recorded after the phone call to be a continuation of the audio recorded before the call.

I track interruptions occurring in the audio recorder using the AVAudioRecorderDelegate methods

  • - (void)audioRecorderBeginInterruption:(AVAudioRecorder *)avRecorder
  • - (void)audioRecorderEndInterruption:(AVAudioRecorder *)avRecorder

In my audioRecorderEndInterruption: method I reactivate the audio session.
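
For context, here is roughly how I handle those callbacks (a simplified sketch, not my exact code; reactivating the session is what my end-interruption handler does):

// Simplified sketch of the AVAudioRecorderDelegate interruption callbacks
- (void)audioRecorderBeginInterruption:(AVAudioRecorder *)avRecorder
{
    // Recording pauses automatically when the interruption begins;
    // nothing is done here apart from updating the UI state.
}

- (void)audioRecorderEndInterruption:(AVAudioRecorder *)avRecorder
{
    // Reactivate the audio session and resume recording
    NSError *err = nil;
    [[AVAudioSession sharedInstance] setActive:YES error:&err];
    if (err)
    {
        DEBUG_LOG(@"audioSession: %@", [err localizedDescription]);
        return;
    }
    [avRecorder record]; // resumes, but only the audio from this point on ends up in the file
}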

Here is the recording code that I use

- (void)startRecordingProcess
{
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *err = nil;
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
    if(err)
    {
        DEBUG_LOG(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }
    err = nil;
    [audioSession setActive:YES error:&err];
    if(err)
    {
        DEBUG_LOG(@"audioSession: %@ %ld %@", [err domain], (long)[err code], [[err userInfo] description]);
        return;
    }
    // Record settings for recording the audio
    recordSetting = [[NSDictionary alloc] initWithObjectsAndKeys:
                     [NSNumber numberWithInt:kAudioFormatMPEG4AAC],AVFormatIDKey,
                     [NSNumber numberWithInt:44100],AVSampleRateKey,
                     [NSNumber numberWithInt: 2],AVNumberOfChannelsKey,
                     [NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
                     [NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
                     [NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
                     nil];
    BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:recorderFilePath];
    if (fileExists) 
    {        
        BOOL appendingFileExists = 
            [[NSFileManager defaultManager] fileExistsAtPath:appendingFilePath];
        if (appendingFileExists)
        {
            [[NSFileManager defaultManager]removeItemAtPath:appendingFilePath error:nil];
        }
        if (appendingFilePath) 
        {
            [appendingFilePath release];
            appendingFilePath = nil;
        }
        appendingFilePath = [[NSString alloc]initWithFormat:@"%@/AppendedAudio.m4a", DOCUMENTS_FOLDER];
        fileUrl = [NSURL fileURLWithPath:appendingFilePath]; 
    }
    else 
    {
        isFirstTime = YES;
        if (recorderFilePath) 
        {
            DEBUG_LOG(@"Testing 2");
            [recorderFilePath release];
            recorderFilePath = nil;
        }
        DEBUG_LOG(@"Testing 3");
        recorderFilePath = [[NSString alloc]initWithFormat:@"%@/RecordedAudio.m4a", DOCUMENTS_FOLDER];
        fileUrl = [NSURL fileURLWithPath:recorderFilePath];
    }
    err = nil;
    recorder = [[AVAudioRecorder alloc] initWithURL:fileUrl settings:recordSetting error:&err];
    if(!recorder)
    {
        DEBUG_LOG(@"recorder: %@ %ld %@", [err domain], (long)[err code], [[err userInfo] description]);
        [[AlertFunctions sharedInstance] showMessageWithTitle:kAppName 
                                                      message:[err localizedDescription] 
                                                     delegate:nil
                                            cancelButtonTitle:@"Ok"];
        return;
    }
    //prepare to record
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    recorder.meteringEnabled = YES;
    [recorder record];

}

While searching for a solution to this issue I came across other links that discuss the same problem: how to resume recording after interruption occured in iphone? and http://www.iphonedevsdk.com/forum/iphone-sdk-development/31268-avaudiorecorderdelegate-interruption.html. I tried the suggestions given in those links but was not successful. I hope to make this work with AVAudioRecorder itself. Is there any way to solve this issue? All suggestions are appreciated.

Answer 1:

After some research I was notified by Apple that this is an issue with the current API. So I worked around it by saving the audio file recorded before the interruption and then joining it with the file recorded after resuming. Hope it helps someone out there who faces the same issue.
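
The joining step can be done with AVMutableComposition and AVAssetExportSession. Here is a minimal sketch (the method name, file URLs and completion handling are placeholders, not my exact code):

// Requires <AVFoundation/AVFoundation.h>
// Illustrative sketch: append the audio at secondURL to the audio at firstURL
- (void)appendAudioAtURL:(NSURL *)secondURL toAudioAtURL:(NSURL *)firstURL outputURL:(NSURL *)outputURL
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *track =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError *error = nil;
    AVURLAsset *firstAsset = [AVURLAsset URLAssetWithURL:firstURL options:nil];
    AVURLAsset *secondAsset = [AVURLAsset URLAssetWithURL:secondURL options:nil];

    // Insert the first recording, then the second one right after it
    [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                   ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]
                    atTime:kCMTimeZero
                     error:&error];
    [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                   ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]
                    atTime:firstAsset.duration
                     error:&error];

    // Export the combined track as a single .m4a file
    AVAssetExportSession *exportSession =
        [AVAssetExportSession exportSessionWithAsset:composition
                                          presetName:AVAssetExportPresetAppleM4A];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeAppleM4A;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status != AVAssetExportSessionStatusCompleted) {
            NSLog(@"Export failed: %@", exportSession.error);
        }
    }];
}

The composition only references the two source files; the export session is what actually writes the combined .m4a to outputURL.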



Answer 2:

I was also facing a similar issue where AVAudioRecorder was recording only the audio after the interruption.
I fixed it by maintaining an array of recordings, keeping them in NSTemporaryDirectory, and finally merging them at the end.

Below are the key steps:

  1. Make your class listen to the AVAudioSessionInterruptionNotification.
  2. On interruption begin (AVAudioSessionInterruptionTypeBegan), save the current recording.
  3. On interruption end (AVAudioSessionInterruptionTypeEnded), start a new recording if the interruption option is AVAudioSessionInterruptionOptionShouldResume.
  4. Append all recordings when the Save button is tapped.

The code snippets for the above-mentioned steps are:

// 1. Make this class listen to the AVAudioSessionInterruptionNotification in viewDidLoad
- (void)viewDidLoad
{
    [super viewDidLoad];

    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(handleAudioSessionInterruption:)
                                                 name:AVAudioSessionInterruptionNotification
                                               object:[AVAudioSession sharedInstance]];

    // other coding stuff
}

// observe the interruption begin / end 
- (void)handleAudioSessionInterruption:(NSNotification*)notification
{
    AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];

    switch (interruptionType) {
        // 2. save recording on interruption begin
        case AVAudioSessionInterruptionTypeBegan:{
            // stop recording
            // Update the UI accordingly
            break;
        }
        case AVAudioSessionInterruptionTypeEnded:{
            if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) {
                // create a new recording
                // Update the UI accordingly
            }
            break;
        }

        default:
            break;
    }
}  

// 4. append all recordings
- (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag
{
    // append all recordings one after other
}

Here is a working example:

//
//  XDRecordViewController.m
//
//  Created by S1LENT WARRIOR
//

#import "XDRecordViewController.h"

@interface XDRecordViewController ()
{
    AVAudioRecorder *recorder;

    __weak IBOutlet UIButton* btnRecord;
    __weak IBOutlet UIButton* btnSave;
    __weak IBOutlet UIButton* btnDiscard;
    __weak IBOutlet UILabel*  lblTimer; // a UILabel to display the recording time

    // some variables to display the timer on a lblTimer
    NSTimer* timer;
    NSTimeInterval intervalTimeElapsed;
    NSDate* pauseStart;
    NSDate* previousFireDate;
    NSDate* recordingStartDate;

    // interruption handling variables
    BOOL isInterrupted;
    NSTimeInterval preInterruptionDuration; // seconds recorded before the interruption

    NSMutableArray* recordings; // an array of recordings to be merged in the end
}
@end

@implementation XDRecordViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Make this class listen to the AVAudioSessionInterruptionNotification
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(handleAudioSessionInterruption:)
                                                 name:AVAudioSessionInterruptionNotification
                                               object:[AVAudioSession sharedInstance]];

    [self clearContentsOfDirectory:NSTemporaryDirectory()]; // clear contents of NSTemporaryDirectory()

    recordings = [NSMutableArray new]; // initialize recordings

    [self setupAudioSession]; // setup the audio session. you may customize it according to your requirements
}

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    [self initRecording];   // start recording as soon as the view appears
}

- (void)dealloc
{
    [self clearContentsOfDirectory:NSTemporaryDirectory()]; // remove all files from NSTemporaryDirectory

    [[NSNotificationCenter defaultCenter] removeObserver:self]; // remove this class from NSNotificationCenter
}

#pragma mark - Event Listeners

// called when recording button is tapped
- (IBAction) btnRecordingTapped:(UIButton*)sender
{
    sender.selected = !sender.selected; // toggle the button

    if (sender.selected) { // resume recording
        [recorder record];
        [self resumeTimer];
    } else { // pause recording
        [recorder pause];
        [self pauseTimer];
    }
}

// called when save button is tapped
- (IBAction) btnSaveTapped:(UIButton*)sender
{
    [self pauseTimer]; // pause the timer

    // disable the UI while the recording is saving so that user may not press the save, record or discard button again
    btnSave.enabled = NO;
    btnRecord.enabled = NO;
    btnDiscard.enabled = NO;

    [recorder stop]; // stop the AVAudioRecorder so that the audioRecorderDidFinishRecording delegate function may get called

    // Deactivate the AVAudioSession
    NSError* error;
    [[AVAudioSession sharedInstance] setActive:NO error:&error];
    if (error) {
        NSLog(@"%@", error);
    }
}

// called when discard button is tapped
- (IBAction) btnDiscardTapped:(id)sender
{
    [self stopTimer]; // stop the timer

    recorder.delegate = Nil; // set delegate to Nil so that audioRecorderDidFinishRecording delegate function may not get called
    [recorder stop];  // stop the recorder

    // Deactivate the AVAudioSession
    NSError* error;
    [[AVAudioSession sharedInstance] setActive:NO error:&error];
    if (error) {
        NSLog(@"%@", error);
    }

    [self.navigationController popViewControllerAnimated:YES];
}

#pragma mark - Notification Listeners
// called when an AVAudioSessionInterruption occurs
- (void)handleAudioSessionInterruption:(NSNotification*)notification
{
    AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    // AVAudioSessionInterruptionOptionKey is only present in userInfo when the interruption ends
    AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];

    switch (interruptionType) {
        case AVAudioSessionInterruptionTypeBegan:{
            // • Recording has stopped, already inactive
            // • Change state of UI, etc., to reflect non-recording state
            preInterruptionDuration += recorder.currentTime; // time elapsed
            if(btnRecord.selected) {    // timer is already running
                [self btnRecordingTapped:btnRecord];  // pause the recording and pause the timer
            }

            recorder.delegate = Nil; // Set delegate to nil so that audioRecorderDidFinishRecording may not get called
            [recorder stop];    // stop recording
            isInterrupted = YES;
            break;
        }
        case AVAudioSessionInterruptionTypeEnded:{
            // • Make session active
            // • Update user interface
            // • AVAudioSessionInterruptionOptionShouldResume option
            if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) {
                // Here you should create a new recording
                [self initRecording];   // create a new recording
                [self btnRecordingTapped:btnRecord];
            }
            break;
        }

        default:
            break;
    }
}

#pragma mark - AVAudioRecorderDelegate
- (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag
{
    [self appendAudiosAtURLs:recordings completion:^(BOOL success, NSURL *outputUrl) {
        // do whatever you want with the new audio file :)
    }];
}

#pragma mark - Timer
- (void)timerFired:(NSTimer*)timer
{
    intervalTimeElapsed++;
    [self updateDisplay];
}

// converts a time interval into a mm:ss display string
- (NSString*) timerStringSinceTimeInterval:(NSTimeInterval)timeInterval
{
    NSDate *timerDate = [NSDate dateWithTimeIntervalSince1970:timeInterval];
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"mm:ss"];
    [dateFormatter setTimeZone:[NSTimeZone timeZoneForSecondsFromGMT:0.0]];
    return [dateFormatter stringFromDate:timerDate];
}

// called when recording pauses
- (void) pauseTimer
{
    pauseStart = [NSDate dateWithTimeIntervalSinceNow:0];

    previousFireDate = [timer fireDate];

    [timer setFireDate:[NSDate distantFuture]];
}

- (void) resumeTimer
{
    if (!timer) {
        timer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                 target:self
                                               selector:@selector(timerFired:)
                                               userInfo:Nil
                                                repeats:YES];
        return;
    }

    NSTimeInterval pauseTime = -[pauseStart timeIntervalSinceNow];

    [timer setFireDate:[previousFireDate dateByAddingTimeInterval:pauseTime]];
}

- (void)stopTimer
{
    [self updateDisplay];
    [timer invalidate];
    timer = nil;
}

- (void)updateDisplay
{
    lblTimer.text = [self timerStringSinceTimeInterval:intervalTimeElapsed];
}

#pragma mark - Helper Functions
- (void) initRecording
{

    // Set the audio file
    NSString* name = [NSString stringWithFormat:@"recording_%@.m4a", @(recordings.count)]; // creating a unique name for each audio file
    NSURL *outputFileURL = [NSURL fileURLWithPathComponents:@[NSTemporaryDirectory(), name]];

    [recordings addObject:outputFileURL];

    // Define the recorder settings
    NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];

    [recordSetting setValue:@(kAudioFormatMPEG4AAC) forKey:AVFormatIDKey];
    [recordSetting setValue:@(44100.0) forKey:AVSampleRateKey];
    [recordSetting setValue:@(1) forKey:AVNumberOfChannelsKey];

    NSError* error;
    // Initiate and prepare the recorder
    recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:&error];
    recorder.delegate = self;
    recorder.meteringEnabled = YES;
    [recorder prepareToRecord];

    if (![AVAudioSession sharedInstance].inputAvailable) { // can not record audio if mic is unavailable
        NSLog(@"Error: Audio input device not available!");
        return;
    }

    intervalTimeElapsed = 0;
    recordingStartDate = [NSDate date];

    if (isInterrupted) {
        intervalTimeElapsed = preInterruptionDuration;
        isInterrupted = NO;
    }

    // Activate the AVAudioSession
    [[AVAudioSession sharedInstance] setActive:YES error:&error];
    if (error) {
        NSLog(@"%@", error);
    }

    recordingStartDate = [NSDate date];  // Set the recording start date
    [self btnRecordingTapped:btnRecord];
}

- (void)setupAudioSession
{

    static BOOL audioSessionSetup = NO;
    if (audioSessionSetup) {
        return;
    }

    AVAudioSession* session = [AVAudioSession sharedInstance];

    [session setCategory:AVAudioSessionCategoryPlayAndRecord
             withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
                   error:Nil];

    [session setMode:AVAudioSessionModeSpokenAudio error:nil];

    audioSessionSetup = YES;
}

// gets an array of audios and append them to one another
// the basic logic was derived from here: http://stackoverflow.com/a/16040992/634958
// i modified this logic to append multiple files
- (void) appendAudiosAtURLs:(NSMutableArray*)urls completion:(void(^)(BOOL success, NSURL* outputUrl))handler
{
    // Create a new audio track we can append to
    AVMutableComposition* composition = [AVMutableComposition composition];
    AVMutableCompositionTrack* appendedAudioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];

    // Grab the first audio asset that needs to be appended
    AVURLAsset* originalAsset = [[AVURLAsset alloc]
                                 initWithURL:urls.firstObject options:nil];
    [urls removeObjectAtIndex:0];

    NSError* error = nil;

    // Grab the first audio track and insert it into our appendedAudioTrack
    AVAssetTrack *originalTrack = [[originalAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, originalAsset.duration);
    [appendedAudioTrack insertTimeRange:timeRange
                                ofTrack:originalTrack
                                 atTime:kCMTimeZero
                                  error:&error];
    CMTime duration = originalAsset.duration;

    if (error) {
        if (handler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                handler(NO, Nil);
            });
        }
        return; // don't continue appending if the first insert failed
    }

    for (NSURL* audioUrl in urls) {
        AVURLAsset* newAsset = [[AVURLAsset alloc]
                                initWithURL:audioUrl options:nil];

        // Grab the rest of the audio tracks and insert them at the end of each other
        AVAssetTrack *newTrack = [[newAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        timeRange = CMTimeRangeMake(kCMTimeZero, newAsset.duration);
        [appendedAudioTrack insertTimeRange:timeRange
                                    ofTrack:newTrack
                                     atTime:duration
                                      error:&error];

        duration = appendedAudioTrack.timeRange.duration;

        if (error) {
            if (handler) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    handler(NO, Nil);
                });
            }
            return; // abort on the first failed insert
        }
    }

    // Create a new audio file using the appendedAudioTrack
    AVAssetExportSession* exportSession = [AVAssetExportSession
                                           exportSessionWithAsset:composition
                                           presetName:AVAssetExportPresetAppleM4A];
    if (!exportSession) {
        if (handler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                handler(NO, Nil);
            });
        }
        return; // cannot export without an export session
    }

    NSArray* appendedAudioPath = @[NSTemporaryDirectory(), @"temp.m4a"]; // name of the final audio file
    exportSession.outputURL = [NSURL fileURLWithPathComponents:appendedAudioPath];
    exportSession.outputFileType = AVFileTypeAppleM4A;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{

        BOOL success = NO;
        // exported successfully?
        switch (exportSession.status) {
            case AVAssetExportSessionStatusFailed:
                break;
            case AVAssetExportSessionStatusCompleted: {
                success = YES;

                break;
            }
            case AVAssetExportSessionStatusWaiting:
                break;
            default:
                break;
        }

        if (handler) {
            dispatch_async(dispatch_get_main_queue(), ^{
                handler(success, exportSession.outputURL);
            });
        }
    }];
}

- (void) clearContentsOfDirectory:(NSString*)directory
{
    NSFileManager *fm = [NSFileManager defaultManager];
    NSError *error = nil;
    for (NSString *file in [fm contentsOfDirectoryAtPath:directory error:&error]) {
        [fm removeItemAtURL:[NSURL fileURLWithPathComponents:@[directory, file]] error:&error];
    }
}

@end

I know it's too late to answer the question, but I hope this helps someone else!