I'm trying to take the preview frames generated by the Android camera and pass their data[] to ffmpeg's input pipe to generate an FLV video.
The command I used was:
ffmpeg -f image2pipe -i pipe: -f flv -vcodec libx264 out.flv
I've also tried forcing the input format to yuv4mpegpipe and rawvideo, but with no success...
The default format of the preview frames generated by the Android camera is NV21.
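For what it's worth, a rawvideo attempt has to spell out the pixel format, frame size and frame rate explicitly, since a raw stream carries no headers. A hypothetical variant is below; the 640x480 size and 15 fps are assumptions, and whether `-pix_fmt nv21` is accepted depends on the ffmpeg build:

```sh
ffmpeg -f rawvideo -pix_fmt nv21 -s 640x480 -r 15 -i pipe: -f flv -vcodec libx264 out.flv
```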
I'm invoking ffmpeg through the Process API and writing the preview frames' data[] to the process's stdin...
The onPreviewFrame() definition is as follows:
public void onPreviewFrame(byte[] data, Camera camera)
{
    try
    {
        // Push the raw NV21 frame straight into ffmpeg's stdin.
        processIn.write(data);
    }
    catch (Exception e)
    {
        Log.e(TAG, FUNCTION + " : " + e.getMessage());
    }
    // Hand the camera a fresh buffer for the next frame.
    camera.addCallbackBuffer(new byte[bufferSize]);
}
processIn is connected to the ffmpeg process's stdin, and bufferSize is computed based on the documentation provided for addCallbackBuffer().
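For reference, the wiring looks roughly like the sketch below. Since ffmpeg itself may not be on the path, `cat` stands in for it here (that substitution, plus the class and variable names, are mine); `cat` echoes stdin back on stdout, which lets the sketch confirm that bytes written to the child's stdin actually arrive:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class PipeSketch {
    // Start the child process and hand back the Process; with ffmpeg the
    // argument list would be the image2pipe command from above.
    static Process start(String... cmd) throws IOException {
        return new ProcessBuilder(cmd).start();
    }

    public static void main(String[] args) throws Exception {
        // `cat` stands in for ffmpeg so the sketch runs anywhere.
        Process p = start("cat");
        OutputStream processIn = p.getOutputStream();
        byte[] frame = {1, 2, 3, 4};   // pretend this is a preview frame
        processIn.write(frame);
        processIn.close();             // send EOF so `cat` terminates
        InputStream out = p.getInputStream();
        int echoed = 0;
        while (out.read() != -1) echoed++;
        p.waitFor();
        System.out.println("piped " + echoed + " bytes");
    }
}
```

With the real ffmpeg command, processIn is exactly the stream the onPreviewFrame() callback writes into.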
Is there something I'm doing wrong...?
Thanks...
Kinda got it working perfectly... The mistake seemed to be related to the vcodec of the image stream: ffmpeg apparently has no provision to decode NV21-format images or image streams. Because of that, the NV21-format preview frames had to be converted to JPEG, and as the images had to be streamed in real time to the ffmpeg process, the conversion had to be done on the fly. The closest reliable solution for on-the-fly conversion to JPEG was as follows:

And the ffmpeg command used was:
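The original snippets weren't preserved here, but the standard route for this on-the-fly conversion is Android's android.graphics.YuvImage, whose compressToJpeg() writes a JPEG for an NV21 buffer straight to an output stream. A minimal sketch, where previewWidth, previewHeight and the quality value 80 are assumptions, as is writing directly into processIn:

```java
import java.io.IOException;
import java.io.OutputStream;

import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;

// Sketch of the on-the-fly NV21 -> JPEG conversion. previewWidth and
// previewHeight must match the actual preview size; processIn is the
// ffmpeg process's stdin from the question.
class FrameEncoder {
    static void writeFrameAsJpeg(byte[] data, int previewWidth,
                                 int previewHeight, OutputStream processIn)
            throws IOException {
        YuvImage yuv = new YuvImage(data, ImageFormat.NV21,
                                    previewWidth, previewHeight, null);
        // Compress the whole frame and write the JPEG bytes straight
        // into the pipe.
        yuv.compressToJpeg(new Rect(0, 0, previewWidth, previewHeight),
                           80, processIn);
    }
}
```

With JPEGs going down the pipe, the matching ffmpeg invocation would presumably force mjpeg as the input codec, something like `ffmpeg -f image2pipe -vcodec mjpeg -i pipe: -f flv -vcodec libx264 out.flv` (again an assumption; the original command wasn't preserved).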