Playing back sound coming from microphone in real-time

Posted 2019-01-21 03:12

Question:

I've been trying to get my application to record the sound coming from the microphone and play it back in (approximately) real time, but without success.

I'm using the AudioRecord and AudioTrack classes for recording and playback, respectively. I've tried different approaches: I recorded the incoming sound and wrote it to a file, and that worked fine. I also played the sound back from that file afterwards with AudioTrack, and that worked fine too. The problem appears when I try to play the sound in real time instead of reading it from a file after it has been written.

Here is the code:

//variables
private int audioSource = MediaRecorder.AudioSource.MIC;
private int samplingRate = 44100; /* in Hz */
private int channelConfig = AudioFormat.CHANNEL_CONFIGURATION_MONO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
private int bufferSize = AudioRecord.getMinBufferSize(samplingRate, channelConfig, audioFormat);
private int sampleNumBits = 16;
private int numChannels = 1;

// …

byte[] data = new byte[bufferSize];

AudioRecord recorder = new AudioRecord(audioSource, samplingRate, channelConfig, audioFormat, bufferSize);
recorder.startRecording();
isRecording = true;

AudioTrack audioPlayer = new AudioTrack(AudioManager.STREAM_MUSIC, 44100, AudioFormat.CHANNEL_CONFIGURATION_MONO,
        AudioFormat.ENCODING_PCM_16BIT, bufferSize, AudioTrack.MODE_STREAM);

if (audioPlayer.getPlayState() != AudioTrack.PLAYSTATE_PLAYING)
    audioPlayer.play();

// capture data and feed it straight to the AudioTrack
int readBytes = 0, writtenBytes = 0;
do {
    readBytes = recorder.read(data, 0, bufferSize);

    if (AudioRecord.ERROR_INVALID_OPERATION != readBytes) {
        writtenBytes += audioPlayer.write(data, 0, readBytes);
    }
} while (isRecording);

A java.lang.IllegalStateException is thrown, caused by "play() called on an uninitialized AudioTrack".

However, if I change the AudioTrack initialization to use, for example, a sampling rate of 8000 Hz and 8-bit samples (instead of 16-bit), it no longer throws the exception and the application runs, although it produces horrible noise.

When I play back from a file with AudioTrack, there is no problem initializing it: I tried 44100 Hz and 16 bits and it worked properly, producing the correct sound.

Any help?

Answer 1:

All native Android audio is encoded. You can only play out PCM formats in real time, or use a special streaming codec, which I don't think is trivial on Android.

The point is that if you want to record and play out audio simultaneously, you have to create your own audio buffer and store raw PCM-encoded audio samples in it (I'm not sure whether you're thinking "duh!" or whether this is all over your head, so I'll try to be clear without over-explaining).

PCM is a digital representation of an analog signal in which your audio samples are a set of "snapshots" of the original acoustic wave. Because all kinds of clever mathematicians and engineers saw the potential in trying to reduce the number of bits you represent this data with, they came up with all sorts of encoders. The encoded (compressed) signal is represented very differently from the raw PCM signal and has to be decoded (en-cod-er+dec-oder = codec). Unless you're using special algorithms and media streaming codecs, it's impossible to play back an encoded signal like you're trying to, because it's not encoded sample by sample, but rather frame by frame, where you need the whole frame of samples, if not the complete signal, to decode this frame.

The way to do it is to manually store the audio samples coming from the microphone buffer and manually feed them to the output buffer. You will have to do some coding for that, but I believe there are some open-source apps you can look at and take a peek at their source (unless you're planning to sell your app later on, of course, but that's a whole different discussion).
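As an illustration, here is a minimal sketch of that mic-to-speaker loop in Java. It assumes the RECORD_AUDIO permission has already been granted and omits error handling and threading concerns; it is a sketch, not a production implementation:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;

public class MicLoopback {
    private volatile boolean isRecording = true;

    public void loop() {
        final int sampleRate = 44100;

        // Use the *input* channel mask for AudioRecord and the *output* mask for AudioTrack,
        // and query each minimum buffer size separately.
        int recordBufferSize = AudioRecord.getMinBufferSize(
                sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        int trackBufferSize = AudioTrack.getMinBufferSize(
                sampleRate, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                sampleRate, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, recordBufferSize);

        AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC,
                sampleRate, AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT, trackBufferSize, AudioTrack.MODE_STREAM);

        byte[] buffer = new byte[recordBufferSize];

        recorder.startRecording();
        player.play();

        // Copy raw PCM samples from the microphone buffer straight into the output buffer.
        while (isRecording) {
            int read = recorder.read(buffer, 0, buffer.length);
            if (read > 0) {
                player.write(buffer, 0, read);
            }
        }

        recorder.stop();
        recorder.release();
        player.stop();
        player.release();
    }
}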

If you're developing for Android 2.3 or later and are not too scared of programming in native code, you can try using OpenSL ES. The Android-specific features of OpenSL ES are listed here. This API gives you somewhat more flexibility in audio manipulation, and you might find just what you need if your app relies heavily on audio processing.



Answer 2:

"A java.lang.IllegalStateException is thrown, caused by 'play() called on an uninitialized AudioTrack'."

This is because the buffer size is too small. I tried bufferSize += 2048; and then it worked.
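For illustration, a sketch of querying the minimum buffer sizes separately for the recorder and the player and padding them; the 2048-byte padding is the answerer's value, and the exact amount is arbitrary:

// Query each minimum size with the matching channel mask, then add some headroom.
int recordBufferSize = AudioRecord.getMinBufferSize(44100,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT) + 2048;
int trackBufferSize = AudioTrack.getMinBufferSize(44100,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT) + 2048;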



Answer 3:

I had a similar problem and I solved it by adding this permission to the manifest file:

<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>