Tap Mic Input Using AVAudioEngine in Swift

Posted 2019-02-01 02:15

I'm really excited about the new AVAudioEngine. It seems like a good API wrapper around Audio Units. Unfortunately the documentation is so far nonexistent, and I'm having problems getting a simple graph to work.

Using the following simple code to set up an audio engine graph, the tap block is never called. It mimics some of the sample code floating around the web, though that code also did not work.

let inputNode = audioEngine.inputNode
var error: NSError?
let bus = 0

inputNode.installTapOnBus(bus, bufferSize: 2048, format: inputNode.inputFormatForBus(bus)) { 
    (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in
    println("sfdljk")
}

audioEngine.prepare()
if audioEngine.startAndReturnError(&error) {
    println("started audio")
} else {
    if let engineStartError = error {
        println("error starting audio: \(engineStartError.localizedDescription)")
    }
}

All I'm looking for is the raw PCM buffer for analysis. I don't need any effects or output. According to the WWDC talk "502: AVAudioEngine in Practice", this setup should work.

Now if you want to capture data from the input node, you can install a node tap and we've talked about that.

But what's interesting about this particular example is, if I wanted to work with just the input node, say just capture data from the microphone and maybe examine it, analyze it in real time or maybe write it out to file, I can directly install a tap on the input node.

And the tap will do the work of pulling the input node for data, stuffing it in buffers and then returning that back to the application.

Once you have that data you can do whatever you need to do with it.
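
(For reference, the analysis I'd eventually run inside the tap is nothing fancy; a rough RMS helper like this hypothetical sketch would be enough, if only the block ever fired:)

func rmsLevel(buffer: AVAudioPCMBuffer) -> Float {
    let samples = buffer.floatChannelData[0]   // channel 0 only
    let frameCount = Int(buffer.frameLength)
    var sumOfSquares: Float = 0.0
    for i in 0..<frameCount {
        sumOfSquares += samples[i] * samples[i]
    }
    return frameCount > 0 ? sqrt(sumOfSquares / Float(frameCount)) : 0.0
}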

Here are some links I tried:

  1. http://hondrouthoughts.blogspot.com/2014/09/avfoundation-audio-monitoring.html
  2. http://jamiebullock.com/post/89243252529/live-coding-audio-with-swift-playgrounds (SIGABRT in playground on startAndReturnError)

Edit: This is the implementation based on Thorsten Karrer's suggestion. It unfortunately does not work.

class AudioProcessor {
    let audioEngine = AVAudioEngine()

    init(){
        let inputNode = audioEngine.inputNode
        let bus = 0
        var error: NSError?

        inputNode.installTapOnBus(bus, bufferSize: 2048, format:inputNode.inputFormatForBus(bus)) {
            (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in
                println("sfdljk")
        }

        audioEngine.prepare()
        audioEngine.startAndReturnError(nil)
        println("started audio")
    }
}

4 Answers
SAY GOODBYE
#2 · 2019-02-01 02:51

It might be the case that your AVAudioEngine is going out of scope and is released by ARC ("If you liked it then you should have put retain on it...").

The following code (engine is moved to an ivar and thus sticks around) fires the tap:

class AppDelegate: NSObject, NSApplicationDelegate {

    let audioEngine  = AVAudioEngine()

    func applicationDidFinishLaunching(aNotification: NSNotification) {
        let inputNode = audioEngine.inputNode
        let bus = 0
        inputNode.installTapOnBus(bus, bufferSize: 2048, format: inputNode.inputFormatForBus(bus)) {
            (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in
            println("sfdljk")
        }

        audioEngine.prepare()
        audioEngine.startAndReturnError(nil)
    }
}

(I removed the error handling for brevity)
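
For reference, the removed error handling is just the same pattern the question already uses:

var error: NSError?
if audioEngine.startAndReturnError(&error) {
    println("started audio")
} else if let engineStartError = error {
    println("error starting audio: \(engineStartError.localizedDescription)")
}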

我只想做你的唯一
#3 · 2019-02-01 02:54

Nice topic.

Hi brodney, your topic helped me find my solution. Here is a similar topic: Generate AVAudioPCMBuffer with AVAudioRecorder.

See the WWDC 2014 lecture "502: AVAudioEngine in Practice": capturing the microphone is covered at around 20:00, and creating a buffer with a tap at around 21:50.

Here is the Swift 3 code:

@IBAction func button01Pressed(_ sender: Any) {

    let inputNode = audioEngine.inputNode
    let bus = 0

    // Install a tap on the input node and read the samples out of each delivered buffer.
    inputNode?.installTap(onBus: bus, bufferSize: 2048, format: inputNode?.inputFormat(forBus: bus)) {
        (buffer: AVAudioPCMBuffer, time: AVAudioTime) -> Void in

        let theLength = Int(buffer.frameLength)
        print("theLength = \(theLength)")

        // Copy channel 0 into a plain Swift array of Doubles.
        var samplesAsDoubles: [Double] = []
        for i in 0 ..< theLength {
            let theSample = Double((buffer.floatChannelData?.pointee[i])!)
            samplesAsDoubles.append(theSample)
        }

        print("samplesAsDoubles.count = \(samplesAsDoubles.count)")
    }

    audioEngine.prepare()
    try! audioEngine.start()
}

To stop the audio:

func stopAudio() {
    let inputNode = audioEngine.inputNode
    let bus = 0
    inputNode?.removeTap(onBus: bus)
    self.audioEngine.stop()
}
闹够了就滚
#4 · 2019-02-01 03:02

The above answer didn't work for me, but the following did. I'm installing a tap on a mixer node.

mMixerNode?.installTapOnBus(0, bufferSize: 4096, format: mMixerNode?.outputFormatForBus(0)) {
    (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in
    NSLog("tapped")
}
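
For context, mMixerNode is presumably the poster's own AVAudioMixerNode that was attached and connected elsewhere; only the tap itself is shown above. A minimal self-contained sketch of the same idea, written in current Swift syntax and tapping the engine's built-in main mixer instead (untested, and assuming the audio session and microphone permission are already set up):

import AVFoundation

final class MixerTapExample {
    let engine = AVAudioEngine()   // kept in a property so ARC doesn't release it

    func start() throws {
        let mixer = engine.mainMixerNode
        // Feed the microphone into the mixer so the mixer has something to render.
        engine.connect(engine.inputNode, to: mixer,
                       format: engine.inputNode.inputFormat(forBus: 0))
        mixer.outputVolume = 0   // avoid hearing the mic through the speakers

        mixer.installTap(onBus: 0, bufferSize: 4096, format: mixer.outputFormat(forBus: 0)) { buffer, _ in
            print("tapped: \(buffer.frameLength) frames")
        }

        engine.prepare()
        try engine.start()
    }

    func stop() {
        engine.mainMixerNode.removeTap(onBus: 0)
        engine.stop()
    }
}

Keeping the engine in a property matters for the same ARC reason pointed out in the first answer above.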
成全新的幸福
#5 · 2019-02-01 03:06

UPDATED: I have implemented a complete working example of recording mic input, applying some effects (reverb, delay, distortion) at runtime, and saving the processed audio to an output file.

var engine = AVAudioEngine()
var distortion = AVAudioUnitDistortion()
var reverb = AVAudioUnitReverb()
var audioBuffer = AVAudioPCMBuffer()
var outputFile = AVAudioFile()
var delay = AVAudioUnitDelay()

//Initialize the audio engine

func initializeAudioEngine() {

    engine.stop()
    engine.reset()
    engine = AVAudioEngine()

    isRealTime = true
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)

        let ioBufferDuration = 128.0 / 44100.0
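        // (128 / 44100 ≈ 2.9 ms per I/O buffer; the preferred duration is a hint the system may round.)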

        try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(ioBufferDuration)

    } catch {

        assertionFailure("AVAudioSession setup error: \(error)")
    }

    let fileUrl = URLFor("/NewRecording.caf")
    print(fileUrl)
    do {

        try outputFile = AVAudioFile(forWriting:  fileUrl!, settings: engine.mainMixerNode.outputFormatForBus(0).settings)
    }
    catch {
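        // Note: if creating the output file fails, the error is silently ignored and outputFile keeps its placeholder value.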

    }

    let input = engine.inputNode!
    let format = input.inputFormatForBus(0)

    //settings for reverb
    reverb.loadFactoryPreset(.MediumChamber)
    reverb.wetDryMix = 40 //0-100 range
    engine.attachNode(reverb)

    delay.delayTime = 0.2 // 0-2 range
    engine.attachNode(delay)

    //settings for distortion
    distortion.loadFactoryPreset(.DrumsBitBrush)
    distortion.wetDryMix = 20 //0-100 range
    engine.attachNode(distortion)


    engine.connect(input, to: reverb, format: format)
    engine.connect(reverb, to: distortion, format: format)
    engine.connect(distortion, to: delay, format: format)
    engine.connect(delay, to: engine.mainMixerNode, format: format)

    assert(engine.inputNode != nil)

    isReverbOn = false

    try! engine.start()
}

//Now the recording function:

func startRecording() {

    let mixer = engine.mainMixerNode
    let format = mixer.outputFormatForBus(0)

    mixer.installTapOnBus(0, bufferSize: 1024, format: format, block:
        { (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in

            print(NSString(string: "writing"))
            do{
                try self.outputFile.writeFromBuffer(buffer)
            }
            catch {
                print(NSString(string: "Write failed"));
            }
    })
}

func stopRecording() {

    engine.mainMixerNode.removeTapOnBus(0)
    engine.stop()
}
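
A hypothetical call sequence, for example from button handlers:

initializeAudioEngine()   // build the graph and start the engine once
startRecording()          // install the mixer tap and begin writing NewRecording.caf
// ... later ...
stopRecording()           // remove the tap and stop the engine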

I hope this might help you. Thanks!
