Question:
I am developing an Android app and want to implement the following feature: record from the phone's built-in mic and, at the same time, play the recorded audio through the phone's speaker or headphones. Is this feasible? If so, please help me with it.
Answer 1:
Here is a simple recording-and-playback application that uses Android's AudioRecord and AudioTrack APIs.
Design:
The recorded audio is written to a buffer and played back from the same buffer. This runs in a loop on a background thread, controlled by Start/Stop buttons.
Code:
private String TAG = "AUDIO_RECORD_PLAYBACK";
private volatile boolean isRunning = true; /* volatile: set by the UI thread, read by the loop thread */
private Thread m_thread; /* Thread for running the Loop */
private AudioRecord recorder = null;
private AudioTrack track = null;
int bufferSize = 320; /* Size in bytes of each recorded chunk */
byte buffer[] = new byte[bufferSize];
/* Method to Enable/Disable Buttons */
private void enableButton(int id,boolean isEnable){
((Button)findViewById(id)).setEnabled(isEnable);
}
The GUI has two buttons, START and STOP.
Enable START and disable STOP initially:
enableButton(R.id.StartButton,true);
enableButton(R.id.StopButton,false);
/* Assign Button Click Handlers */
((Button)findViewById(R.id.StartButton)).setOnClickListener(btnClick);
((Button)findViewById(R.id.StopButton)).setOnClickListener(btnClick);
Map the START and STOP buttons to an OnClickListener:
private View.OnClickListener btnClick = new View.OnClickListener() {
@Override
public void onClick(View v) {
switch(v.getId()){
case R.id.StartButton:
{
Log.d(TAG, "======== Start Button Pressed ==========");
isRunning = true;
do_loopback(isRunning);
enableButton(R.id.StartButton,false);
enableButton(R.id.StopButton,true);
break;
}
case R.id.StopButton:
{
Log.d(TAG, "======== Stop Button Pressed ==========");
isRunning = false;
do_loopback(isRunning);
enableButton(R.id.StopButton,false);
enableButton(R.id.StartButton,true);
break;
}
}
}
};
Start the Thread:
private void do_loopback(final boolean flag)
{
m_thread = new Thread(new Runnable() {
public void run() {
run_loop(flag);
}
});
m_thread.start();
}
Methods for initializing AudioTrack and AudioRecord:
public AudioTrack findAudioTrack (AudioTrack track)
{
Log.d(TAG, "===== Initializing AudioTrack API ====");
int m_bufferSize = AudioTrack.getMinBufferSize(8000,
AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT);
if (m_bufferSize != AudioTrack.ERROR_BAD_VALUE)
{
track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT, m_bufferSize,
AudioTrack.MODE_STREAM);
if (track.getState() == AudioTrack.STATE_UNINITIALIZED) {
Log.e(TAG, "===== AudioTrack Uninitialized =====");
return null;
}
}
return track;
}
public AudioRecord findAudioRecord (AudioRecord recorder)
{
Log.d(TAG, "===== Initializing AudioRecord API =====");
int m_bufferSize = AudioRecord.getMinBufferSize(8000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT);
if (m_bufferSize != AudioRecord.ERROR_BAD_VALUE)
{
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, m_bufferSize);
if (recorder.getState() == AudioRecord.STATE_UNINITIALIZED) {
Log.e(TAG, "====== AudioRecord Uninitialized ====== ");
return null;
}
}
return recorder;
}
The valid parameter values for findAudioRecord or findAudioTrack can vary from device to device; please refer to this question.
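For example, here is a hedged sketch of how one might probe for a sample rate the current device accepts (the candidate list and helper name are illustrative, not part of the original answer):
/* Probe candidate sample rates until getMinBufferSize() accepts one.
 * Only 44100 Hz is guaranteed on all devices; the other rates are
 * common but optional. */
private static final int[] CANDIDATE_RATES = { 8000, 11025, 16000, 22050, 44100 };

private int findSupportedSampleRate() {
    for (int rate : CANDIDATE_RATES) {
        int size = AudioRecord.getMinBufferSize(rate,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        if (size > 0) {
            return rate; /* this configuration is valid on this device */
        }
    }
    return -1; /* no candidate rate worked */
}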
Code for Running the loop:
public void run_loop (boolean running)
{
/** == If the Stop button was pressed == **/
if (!running) {
Log.d(TAG, "===== Stop Button is pressed ===== ");
if (AudioRecord.STATE_INITIALIZED == recorder.getState()){
recorder.stop();
recorder.release();
}
if (AudioTrack.STATE_INITIALIZED == track.getState()){
track.stop();
track.release();
}
return;
}
/** ======= Initialize AudioRecord and AudioTrack ======== **/
recorder = findAudioRecord(recorder);
if (recorder == null) {
Log.e(TAG, "======== findAudioRecord : Returned Error! =========== ");
return;
}
track = findAudioTrack(track);
if (track == null) {
Log.e(TAG, "======== findAudioTrack : Returned Error! ========== ");
return;
}
if ((AudioRecord.STATE_INITIALIZED == recorder.getState()) &&
(AudioTrack.STATE_INITIALIZED == track.getState()))
{
recorder.startRecording();
Log.d(TAG, "========= Recorder Started... =========");
track.play();
Log.d(TAG, "========= Track Started... =========");
}
else
{
Log.e(TAG, "==== Initialization failed for AudioRecord or AudioTrack ====");
return;
}
/** ------------------------------------------------------ **/
/* Record and play in chunks of 320 bytes (20 ms at 8 kHz, 16-bit mono) */
while (isRunning) /* volatile field, cleared by the Stop button */
{
/* Read a chunk from the mic and write it straight to the output */
int bytesRead = recorder.read(buffer, 0, bufferSize);
if (bytesRead > 0) {
track.write(buffer, 0, bytesRead);
}
}
Log.i(TAG, "Loopback exit");
return;
}
Please include the following in AndroidManifest.xml:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
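On Android 6.0 (API 23) and later, RECORD_AUDIO is a dangerous permission, so the manifest entry alone is not enough. A minimal sketch of the runtime request, assuming the support/AndroidX compat classes (the request code is arbitrary):
private static final int REQUEST_RECORD_AUDIO = 1; /* arbitrary request code */

private void ensureAudioPermission() {
    /* Ask the user for RECORD_AUDIO at runtime if it is not yet granted */
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{ Manifest.permission.RECORD_AUDIO },
                REQUEST_RECORD_AUDIO);
    }
}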
The same APIs can also be used to write the recorded audio to a file and read it back; a sketch follows below.
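For instance, a minimal sketch of capturing raw PCM to a file instead of looping it back (the file name is illustrative; exception handling omitted for brevity):
/* Write each recorded chunk to an app-private file; capture.pcm is a
 * placeholder name. The result is headerless PCM -- prepend a WAV
 * header if a standard player needs to open it. */
FileOutputStream out = new FileOutputStream(new File(getFilesDir(), "capture.pcm"));
recorder.startRecording();
while (isRunning) {
    int n = recorder.read(buffer, 0, bufferSize);
    if (n > 0) {
        out.write(buffer, 0, n);
    }
}
recorder.stop();
out.close();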
Why use AudioRecord over MediaRecorder? See here.
The code has been tested on a Google Nexus 5 and works.
Note: Please add error-checking code for recorder.read and track.write in case they fail, and likewise for findAudioRecord and findAudioTrack.
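A sketch of what that error checking could look like inside the loop (the constants are real AudioRecord/AudioTrack error codes; the structure is illustrative):
int bytesRead = recorder.read(buffer, 0, bufferSize);
if (bytesRead == AudioRecord.ERROR_INVALID_OPERATION
        || bytesRead == AudioRecord.ERROR_BAD_VALUE) {
    Log.e(TAG, "recorder.read failed: " + bytesRead);
    break; /* leave the loop instead of writing garbage */
}
int written = track.write(buffer, 0, bytesRead);
if (written < 0) {
    Log.e(TAG, "track.write failed: " + written);
    break;
}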
Answer 2:
First, in the onCreate method, create a MediaRecorder object and the path to the file where you want to save the recorded data.
String outputFile = Environment.getExternalStorageDirectory().
getAbsolutePath() + "/myrecording.3gp"; // Define outputFile outside onCreate method
MediaRecorder myAudioRecorder = new MediaRecorder(); // Define this outside onCreate method
myAudioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myAudioRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myAudioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); // AudioEncoder, not OutputFormat
myAudioRecorder.setOutputFile(outputFile);
These three methods can be wired to buttons to start recording, stop recording, and play it back:
public void start(View view){
try {
myAudioRecorder.prepare();
myAudioRecorder.start();
} catch (IllegalStateException e) {
/* prepare() was called in the wrong state */
e.printStackTrace();
} catch (IOException e) {
/* the output file could not be opened */
e.printStackTrace();
}
start.setEnabled(false);
stop.setEnabled(true);
Toast.makeText(getApplicationContext(), "Recording started", Toast.LENGTH_LONG).show();
}
public void stop(View view){
myAudioRecorder.stop();
myAudioRecorder.release();
myAudioRecorder = null;
stop.setEnabled(false);
play.setEnabled(true);
Toast.makeText(getApplicationContext(), "Audio recorded successfully",
Toast.LENGTH_LONG).show();
}
public void play(View view) throws IllegalArgumentException,
SecurityException, IllegalStateException, IOException{
MediaPlayer m = new MediaPlayer();
m.setDataSource(outputFile);
m.prepare();
m.start();
Toast.makeText(getApplicationContext(), "Playing audio", Toast.LENGTH_LONG).show();
}
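Note that this answer writes to external storage, so on older Android versions the manifest will likely also need, alongside RECORD_AUDIO:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />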
Answer 3:
According to the developer documentation here, Android supports the RTSP protocol (for real-time streaming) as well as the HTTP/HTTPS live streaming draft protocol.
There is also an example here. You will need some basic knowledge of streaming servers such as Red5 or Wowza.
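As a starting point, a minimal sketch of playing a remote stream with MediaPlayer (the URL is a placeholder; substitute your own RTSP or HLS endpoint):
MediaPlayer player = new MediaPlayer();
try {
    player.setDataSource("rtsp://example.com/live/stream"); /* hypothetical URL */
    player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start(); /* start playback once the stream is ready */
        }
    });
    player.prepareAsync(); /* prepare asynchronously for network sources */
} catch (IOException e) {
    e.printStackTrace();
}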