I would like to create a video recorder, but so far I haven't figured out how to set the parameters so that the MediaRecorder.prepare() call succeeds.
When I execute the following method:
public void start() throws IOException {
    String state = android.os.Environment.getExternalStorageState();
    if (!state.equals(Environment.MEDIA_MOUNTED)) {
        throw new IOException("SD card is not mounted. It is " + state + ".");
    }

    File directory = new File(path).getParentFile();
    if (!directory.exists() && !directory.mkdirs()) {
        throw new IOException("Path to file could not be created.");
    }

    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
    recorder.setVideoFrameRate(15);
    recorder.setVideoSize(176, 144);
    recorder.setOutputFile(path);
    recorder.prepare();
    recorder.start();

    this.state = VideoRecorderState.STATE_RECORDING;
}
it throws an exception at the recorder.prepare() line.
How do I set the parameters so that I can capture video?
Here is a snippet that works:
m_recorder = new MediaRecorder();
m_recorder.setPreviewDisplay(m_BeMeSurface);
m_recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
m_recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
m_recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
m_recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
m_recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
m_recorder.setMaxDuration((int) MAX_TIME);
m_recorder.setOnInfoListener(m_BeMeSelf);
m_recorder.setVideoSize(320, 240);
m_recorder.setVideoFrameRate(15);
m_recorder.setOutputFile(m_path);
m_recorder.prepare();
m_recorder.start();
THE most important thing is the preview surface. Your code doesn't set one, and without it prepare() fails.
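For illustration, here is a minimal sketch (not part of the original answer; the class name, layout id and output path are made up) of wiring a SurfaceView's holder into MediaRecorder before prepare():

import java.io.IOException;
import android.app.Activity;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class RecorderActivity extends Activity implements SurfaceHolder.Callback {
    private MediaRecorder recorder;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.recorder);  // hypothetical layout containing one SurfaceView
        SurfaceView preview = (SurfaceView) findViewById(R.id.preview_surface);
        SurfaceHolder holder = preview.getHolder();
        holder.addCallback(this);
        // Needed on pre-Honeycomb devices so the camera can push frames directly
        holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
        recorder.setVideoSize(176, 144);
        recorder.setVideoFrameRate(15);
        recorder.setOutputFile("/sdcard/test.3gp");  // illustrative path
        // The preview surface must be set before prepare(); leaving it out is
        // the usual reason prepare() fails
        recorder.setPreviewDisplay(holder.getSurface());
        try {
            recorder.prepare();
            recorder.start();
        } catch (IOException e) {
            Log.e("RecorderActivity", "prepare() or start() failed", e);
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        if (recorder != null) {
            recorder.release();  // release() is safe in any state
            recorder = null;
        }
    }
}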
I answer exactly this question in the following tutorial:
http://integratingstuff.wordpress.com/2010/10/18/writing-code-that-captures-videos-on-android/
Your code fails on prepare() because you didn't set all the necessary properties; for example, you also need to set a maximum duration.
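As an aside (not from the tutorial; `recorder` stands for your MediaRecorder instance and the duration is an illustrative value), if you set a maximum duration you can be notified when it is reached through an OnInfoListener:

recorder.setMaxDuration(30000); // 30 seconds, illustrative value
recorder.setOnInfoListener(new MediaRecorder.OnInfoListener() {
    @Override
    public void onInfo(MediaRecorder mr, int what, int extra) {
        if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
            // Recording stops automatically at the limit; just clean up here
            mr.release();
        }
    }
});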
I had the same question. I was coming from a background audio-recording service and hoping to create a background video-recording service. You can't truly record video in the background, but you can make the video preview very small inside your existing UI. I followed the tutorial http://integratingstuff.wordpress.com/2010/10/18/writing-code-that-captures-videos-on-android/ and the sample Camera Preview demo, but ultimately the sample code at http://www.apress.com/downloadable/download/sample/sample_id/39/ was simple enough to tweak yet complete enough to work with setCamera. I am posting my solution here to save others time on the path from toy examples to a complete example with good-quality background video recording (using the front-facing camera if necessary).
This is the source for an Android video recorder with "no" preview (the preview is a 1x1-pixel view that simulates an unobtrusive recording LED), so video can be recorded without distracting users. To use your own UI, simply change video_recorder.xml to your own layout (be sure to keep the VideoView). It was tested on Android 2.2 and 3.0 devices.
Suitable use cases:
- an eye-gaze-tracking library that lets users navigate a web page with their eyes as a mouse
- using a tablet's camera(s) to replace the video camera in lab/clinic experiments (psycholinguistics or speech pathology)
Layout xml:
<?xml version="1.0" encoding="utf-8"?>
<!-- This file is /res/layout/video_recorder.xml based on listing 9-6 in Pro Android 2 -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical" android:layout_width="fill_parent"
android:layout_height="fill_parent">
<RelativeLayout android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:gravity="center">
<VideoView android:id="@+id/videoView" android:layout_width="1px"
android:layout_height="1px" />
</RelativeLayout>
</LinearLayout>
Java class:
import java.io.File;
import android.app.Activity;
import android.hardware.Camera;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.util.Log;
import android.view.MotionEvent;
import android.view.SurfaceHolder;
import android.widget.Toast;
import android.widget.VideoView;
/**
* Android video recorder with "no" preview (the preview is a 1x1 pixel which
* simulates an unobtrusive recording led). Based on Pro Android 2 2010 (Hashimi
* et al) source code in Listing 9-6.
*
* Also demonstrates how to use the front-facing and back-facing cameras.
* A calling Intent can pass an Extra to use the front facing camera if available.
*
* Suitable use cases:
* A: eye gaze tracking library to let users use eyes as a mouse to navigate a web page
* B: use tablet camera(s) to replace video camera in lab experiments
* (psycholinguistics or other experiments)
*
* Video recording is controlled in two ways:
* 1. Video starts and stops with the activity
* 2. Video starts and stops on any touch
*
* To control recording in other ways see the try blocks of the onTouchEvent
*
* To incorporate into project add these features and permissions to
* manifest.xml:
*
* <uses-feature android:name="android.hardware.camera"/>
* <uses-feature android:name="android.hardware.camera.autofocus"/>
*
* <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
* <uses-permission android:name="android.permission.CAMERA" />
* <uses-permission android:name="android.permission.RECORD_AUDIO" />
*
* Tested Date: October 2 2011 with manifest.xml
* <uses-sdk android:minSdkVersion="8" android:targetSdkVersion="11"/>
*/
public class VideoRecorderSubExperiment extends Activity implements
SurfaceHolder.Callback {
public static final String EXTRA_USE_FRONT_FACING_CAMERA ="frontcamera";
private static final String OUTPUT_FILE = "/sdcard/videooutput";
private static final String TAG = "RecordVideo";
private Boolean mRecording = false;
private Boolean mUseFrontFacingCamera = false;
private VideoView mVideoView = null;
private MediaRecorder mVideoRecorder = null;
private Camera mCamera;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.video_recorder);
mVideoView = (VideoView) this.findViewById(R.id.videoView);
//mUseFrontFacingCamera = getIntent().getExtras().getBoolean(
// EXTRA_USE_FRONT_FACING_CAMERA, true);
if(mUseFrontFacingCamera){
// If caller wants to use front facing camera, then make sure the device has one...
// Hard coded to only open front facing camera on Xoom (model MZ604)
// For more universal solution try:
// http://stackoverflow.com/questions/2779002/how-to-open-front-camera-on-android-platform
String deviceModel = android.os.Build.MODEL;
if (deviceModel.contains("MZ604")) {
mUseFrontFacingCamera = true;
} else {
Toast.makeText(
getApplicationContext(),
"The App isn't designed to use this Android's front facing camera.\n " +
"The device model is : " + deviceModel, Toast.LENGTH_LONG).show();
mUseFrontFacingCamera = false;
}
}
final SurfaceHolder holder = mVideoView.getHolder();
holder.addCallback(this);
holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public boolean onTouchEvent(MotionEvent event) {
// can use the xy of the touch to start and stop recording
float positionX = event.getX();
float positionY = event.getY();
switch (event.getAction()) {
case MotionEvent.ACTION_DOWN:
// Screen is pressed for the first time
break;
case MotionEvent.ACTION_MOVE:
// Screen is still pressed and the touch coordinates have been updated
break;
case MotionEvent.ACTION_UP:
// Screen is not touched anymore
if (mRecording) {
// To stop recording attach this try block to another event listener,
// button etc
try {
stopRecording();
} catch (Exception e) {
Log.e(TAG, e.toString());
e.printStackTrace();
}
} else {
// To begin recording attach this try block to another event listener,
// button etc
try {
beginRecording(mVideoView.getHolder());
} catch (Exception e) {
Log.e(TAG, e.toString());
e.printStackTrace();
}
}
break;
}
return super.onTouchEvent(event);
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
beginRecording(holder);
} catch (Exception e) {
Log.e(TAG, e.toString());
e.printStackTrace();
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width,
int height) {
Log.v(TAG, "Width x Height = " + width + "x" + height);
}
private void stopRecording() throws Exception {
mRecording = false;
if (mVideoRecorder != null) {
mVideoRecorder.stop();
mVideoRecorder.release();
mVideoRecorder = null;
}
if (mCamera != null) {
mCamera.reconnect();
mCamera.stopPreview();
mCamera.release();
mCamera = null;
}
}
@Override
protected void onDestroy() {
try {
stopRecording();
} catch (Exception e) {
Log.e(TAG, e.toString());
e.printStackTrace();
}
super.onDestroy();
}
/**
* Uses the surface defined in video_recorder.xml
* Tested using
* 2.2 (HTC Desire/Hero phone) -> using all defaults works; records the back-facing camera with AMR_NB audio
* 3.0 (Motorola Xoom tablet) -> using all defaults does not work; it works with the settings below and might work with others
*
* @param holder The surfaceholder from the videoview of the layout
* @throws Exception
*/
private void beginRecording(SurfaceHolder holder) throws Exception {
if (mVideoRecorder != null) {
mVideoRecorder.stop();
mVideoRecorder.release();
mVideoRecorder = null;
}
if (mCamera != null) {
mCamera.reconnect();
mCamera.stopPreview();
mCamera.release();
mCamera = null;
}
String uniqueOutFile = OUTPUT_FILE + System.currentTimeMillis() + ".3gp";
File outFile = new File(uniqueOutFile);
if (outFile.exists()) {
outFile.delete();
}
try {
if (mUseFrontFacingCamera) {
//hard coded assuming 1 is the front facing camera
mCamera = Camera.open(1);
} else {
mCamera = Camera.open();
}
// Camera setup is based on the API Camera Preview demo
mCamera.setPreviewDisplay(holder);
Camera.Parameters parameters = mCamera.getParameters();
parameters.setPreviewSize(640, 480);
mCamera.setParameters(parameters);
mCamera.startPreview();
mCamera.unlock();
mVideoRecorder = new MediaRecorder();
mVideoRecorder.setCamera(mCamera);
// Media recorder setup is based on Listing 9-6, Hashimi et all 2010
// values based on best practices and good quality,
// tested via upload to YouTube and played in QuickTime on Mac Snow Leopard
mVideoRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mVideoRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mVideoRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); // MPEG_4 is another option
mVideoRecorder.setVideoSize(640, 480); // YouTube recommends 320x240, OpenGazer eye tracker uses 640x480, YouTube HD is 1280x720
mVideoRecorder.setVideoFrameRate(20); // may be overridden by the camera depending on lighting
mVideoRecorder.setVideoEncodingBitRate(3000000); // 3 Mbps, or the maximum the camera supports
mVideoRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264); // H.264 is MPEG-4 Part 10 (AVC); MPEG_4_SP (Simple Profile) targets low bit rates and resolutions
int sdk = android.os.Build.VERSION.SDK_INT;
// Gingerbread (API 10) and up can record wide-band audio, i.e. 16,000 Hz
// (acceptable quality for human voice)
if (sdk >= 10) {
mVideoRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_WB);
mVideoRecorder.setAudioSamplingRate(16000);
} else {
// Older devices only have narrow band, i.e. 8,000 Hz
// (the same quality as a phone call; not really good for any purpose.
// For human voice, 8,000 Hz makes /f/ and /th/ indistinguishable)
mVideoRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
}
mVideoRecorder.setMaxDuration(30000); // limit to 30 seconds
mVideoRecorder.setPreviewDisplay(holder.getSurface());
mVideoRecorder.setOutputFile(uniqueOutFile);
mVideoRecorder.prepare();
mVideoRecorder.start();
mRecording = true;
} catch (Exception e) {
Log.e(TAG, e.toString());
e.printStackTrace();
}
}
}
Maybe the Camera application's source code will help you debug this.
Have you checked this out?
http://code.google.com/p/android/issues/detail?id=5050
The commenters there suggest that it is a timing issue and that the MediaRecorder state machine may require some (possibly hardware-dependent) delay between states.
It would be nice if there were callbacks for when each state is fully reached; then we could simply call prepare() from the callback.
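If that is what you are hitting, one work-around (purely a sketch; the retry count, the delay and the configureRecorder() helper are my own placeholders for whatever setup code you already have) is to re-run the configuration and prepare() a few times with a short pause in between:

private void prepareWithRetry(MediaRecorder recorder) throws IOException {
    final int maxAttempts = 3;
    for (int attempt = 1; ; attempt++) {
        try {
            configureRecorder(recorder); // hypothetical helper holding your setter calls
            recorder.prepare();
            return;
        } catch (IOException e) {
            if (attempt == maxAttempts) {
                throw e;
            }
            recorder.reset(); // back to the initial state before reconfiguring
            try {
                Thread.sleep(200); // arbitrary pause; tune for your hardware
            } catch (InterruptedException ie) {
                Thread.currentThread().interrupt();
                throw e;
            }
        }
    }
}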
This could be a permissions error. Do you have the android.permission.CAMERA permission set in your AndroidManifest file?
In my case, copying and pasting the samples above didn't work.
Then, looking through the methods on MediaRecorder, I found setPreviewDisplay.
I called this method, passing the same surface I had used in Camera.setPreviewDisplay; the IOException from prepare() went away and I was able to record video.
Try it yourself and post your results.
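Roughly what that looks like (a sketch only; the method and variable names are illustrative, and the remaining recorder setters are omitted):

private void attachPreview(Camera camera, MediaRecorder recorder, SurfaceHolder holder)
        throws IOException {
    camera.setPreviewDisplay(holder);                // Camera draws its preview here
    camera.startPreview();
    camera.unlock();                                 // hand the camera over to MediaRecorder
    recorder.setCamera(camera);
    recorder.setPreviewDisplay(holder.getSurface()); // the same surface, as a Surface
    // ...remaining setters, then recorder.prepare() should no longer throw the IOException
}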