I have searched for a solution and found JavaCV and FFmpeg.
I need step-by-step instructions for integrating them, like this:
Steps for Integrating FFmpeg
(1.) Download FFmpeg from here: http://bambuser.com/opensource. It contains scripts to build FFmpeg for Android.
(2.) Modify build.sh: replace "com.bambuser.broadcaster" with your package name. You also need to set the FFmpeg configure flags to enable the codecs you're interested in.
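For example, the configure invocation inside build.sh might be trimmed so only the pieces you need are compiled. The codec, muxer, and demuxer names below are illustrative, not a recommendation; run ./configure --list-encoders (and the matching --list-* options) in the FFmpeg source tree to see what is available:

```sh
# Illustrative FFmpeg configure flags for a minimal Android build.
# Disable everything, then re-enable only what your app actually uses.
./configure \
    --target-os=linux \
    --arch=arm \
    --enable-cross-compile \
    --disable-everything \
    --enable-decoder=h264 \
    --enable-encoder=mpeg4 \
    --enable-muxer=mp4 \
    --enable-demuxer=mov \
    --enable-protocol=file
```

Keeping the build minimal noticeably reduces both compile time and the size of the resulting .so files.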
(3.) Run build.sh and copy the build/ffmpeg directory into your app's jni/lib directory.
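The build-and-copy step is roughly the following (the project path is a placeholder; adjust it to your own layout):

```sh
# Build FFmpeg with the modified script, then copy the output
# (libraries and headers) into the app's JNI tree.
./build.sh
cp -r build/ffmpeg <your-app>/jni/lib
```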
(4.) Use fasaxc's makefile from the SO post.
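If you cannot use that makefile directly, a minimal Android.mk along the same lines might look like this. The module names, library filename, and paths are assumptions; match them to what build.sh actually produced for your ABI:

```makefile
LOCAL_PATH := $(call my-dir)

# Prebuilt FFmpeg shared library produced by build.sh (filename is illustrative)
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-prebuilt
LOCAL_SRC_FILES := lib/ffmpeg/armeabi/libffmpeg.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/lib/ffmpeg/include
include $(PREBUILT_SHARED_LIBRARY)

# Your JNI wrapper that calls into FFmpeg
include $(CLEAR_VARS)
LOCAL_MODULE := native
LOCAL_SRC_FILES := native.c
LOCAL_SHARED_LIBRARIES := ffmpeg-prebuilt
LOCAL_LDLIBS := -llog
include $(BUILD_SHARED_LIBRARY)
```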
(5.) Create a native.c file in your jni directory and a Java wrapper for it. To start with, you can model them after the hello-jni sample in the NDK (/samples/hello-jni).
(6.) Include the headers you need in native.c, e.g. #include "libavcodec/avcodec.h", and call the functions you need: avcodec_register_all(), etc.
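A minimal native.c might look like the sketch below. The function name is a placeholder: the Java_com_example_Native part must mirror your actual Java package and class name, and this only compiles inside the NDK build with the FFmpeg headers from step 3 on the include path:

```c
#include <jni.h>
#include "libavcodec/avcodec.h"
#include "libavformat/avformat.h"

/* Called once from Java to initialize FFmpeg.
 * Rename to match your package/class: Java_<package>_<Class>_<method>. */
JNIEXPORT void JNICALL
Java_com_example_Native_initFfmpeg(JNIEnv *env, jclass clazz)
{
    av_register_all();      /* registers muxers and demuxers */
    avcodec_register_all(); /* registers codecs */
}
```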
(7.) Load the native library in your root activity by adding: static { System.loadLibrary("native"); } Note that the method is System.loadLibrary (not loadLibraries) and takes the module name without the "lib" prefix or ".so" suffix; "native" here is a placeholder matching a library named libnative.so.
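The matching Java wrapper might look like this; the package, class, and library names are placeholders chosen to line up with the native.c sketch above:

```java
package com.example;

public class Native {
    static {
        // "native" must match the module name of your JNI library
        // (i.e. the build produces libnative.so).
        System.loadLibrary("native");
    }

    // Implemented in native.c
    public static native void initFfmpeg();
}
```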
There is also a very useful guide on Liu Feipeng's blog: http://www.roman10.net/how-to-build-ffmpeg-for-android/
If you can target a recent API level (Android 4.1 to 4.3), you should try MediaCodec: http://developer.android.com/reference/android/media/MediaCodec.html
From the changelog:
Android 4.1 (API level 16) added the MediaCodec class for low-level encoding and decoding of media content. When encoding video, Android 4.1 required that you provide the media with a ByteBuffer array, but Android 4.3 now allows you to use a Surface as the input to an encoder. For instance, this allows you to encode input from an existing video file or using frames generated from OpenGL ES.
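A sketch of configuring an encoder with Surface input on API 18+ follows. The MIME type, resolution, bitrate, and frame rate are arbitrary examples, and this only compiles against the Android SDK:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

public class EncoderSetup {
    // Sketch: create an H.264 encoder fed from a Surface (API 18+).
    static MediaCodec startSurfaceEncoder() throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Render frames into this Surface (e.g. via OpenGL ES) to feed the encoder.
        Surface inputSurface = encoder.createInputSurface();
        encoder.start();
        return encoder;
    }
}
```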
Otherwise, if you want a fully custom solution, you could go the native route and implement your encoder using JNI and libwebm (from Google): http://www.webmproject.org/code/