Android Camera Capture using FFmpeg

Published 2019-01-22 22:58

I'm trying to take the preview frames generated by the Android camera and pass the data[] to ffmpeg's input pipe to generate an FLV video. The command I used was:

ffmpeg -f image2pipe -i pipe: -f flv -vcodec libx264 out.flv

I've also tried forcing the input format to yuv4mpegpipe and rawvideo, but with no success. The default format of the preview frames generated by the Android camera is NV21. I'm invoking ffmpeg through the Process API and writing the preview frame data[] to the process's stdin. The onPreviewFrame() definition is as follows:

public void onPreviewFrame(byte[] data, Camera camera)
{
    try
    {
        // Write the raw NV21 preview frame to ffmpeg's stdin
        processIn.write(data);
    }
    catch(Exception e)
    {
        Log.e(TAG, FUNCTION + " : " + e.getMessage());
    }
    // Hand a buffer back to the camera for the next frame
    camera.addCallbackBuffer(new byte[bufferSize]);
}

processIn is connected to the ffmpeg process's stdin, and bufferSize is computed based on the documentation for addCallbackBuffer(). Is there something I'm doing wrong?
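For reference, the process and buffer setup is roughly along these lines. This is a sketch, not the exact code: the class name is illustrative, and the bufferSize arithmetic assumes NV21's 12 bits per pixel as described in the addCallbackBuffer() documentation.

```java
import java.io.OutputStream;
import java.util.Arrays;
import java.util.List;

public class FfmpegPipeSetup {
    // NV21 uses 12 bits per pixel: a full-resolution Y plane plus a
    // half-resolution interleaved V/U plane.
    static final int NV21_BITS_PER_PIXEL = 12;

    static int computeBufferSize(int width, int height) {
        return width * height * NV21_BITS_PER_PIXEL / 8;
    }

    // The command from the question; "-i pipe:" makes ffmpeg read stdin.
    static List<String> buildCommand() {
        return Arrays.asList("ffmpeg", "-f", "image2pipe", "-i", "pipe:",
                             "-f", "flv", "-vcodec", "libx264", "out.flv");
    }

    public static void main(String[] args) {
        System.out.println(computeBufferSize(640, 480));
        try {
            Process ffmpeg = new ProcessBuilder(buildCommand())
                    .redirectErrorStream(true)
                    .start();
            OutputStream processIn = ffmpeg.getOutputStream();
            // onPreviewFrame() then writes each frame to processIn
            processIn.close();
        } catch (Exception e) {
            // ffmpeg binary not present on this machine; setup demo only
            System.out.println("ffmpeg not available: " + e.getMessage());
        }
    }
}
```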

Thanks...

1 Answer
Viruses.
· 2019-01-22 23:55

I got it working. The mistake was related to the vcodec of the image stream: ffmpeg's image2pipe input seems to have no provision for decoding raw NV21 images or an NV21 image stream. So the NV21 preview frames had to be converted to JPEG, and since the images had to be streamed to the ffmpeg process in real time, the conversion had to happen on the fly. The closest reliable solution for on-the-fly conversion to JPEG was as follows:

public void onPreviewFrame(byte[] data, Camera camera)
{
    // Cache the preview geometry on the first frame
    if(isFirstFrame)
    {
        Camera.Parameters cameraParam = camera.getParameters();
        Camera.Size previewSize = cameraParam.getPreviewSize();
        previewFormat = cameraParam.getPreviewFormat();
        frameWidth = previewSize.width;
        frameHeight = previewSize.height;
        frameRect = new Rect(0, 0, frameWidth, frameHeight);
        isFirstFrame = false;
    }

    // Wrap the NV21 frame and compress it straight into ffmpeg's stdin
    previewImage = new YuvImage(data, previewFormat, frameWidth, frameHeight, null);

    if(previewImage.compressToJpeg(frameRect, 50, processIn))
        Log.d(TAG, "Data : " + data.length);

    previewImage = null;

    // Hand a buffer back to the camera for the next frame
    camera.addCallbackBuffer(new byte[bufferSize]);
}

And the ffmpeg command used was :

ffmpeg -f image2pipe -vcodec mjpeg -i pipe: -f flv -vcodec libx264 out.flv
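For what it's worth, the heavy lifting that compressToJpeg() does is the NV21 (YUV) to RGB conversion plus JPEG encoding. The per-pixel YUV-to-RGB step follows the standard BT.601 fixed-point approximation, roughly like the sketch below. The class and method names here are illustrative, not Android API; YuvImage's real implementation is native code.

```java
public class Nv21ToRgb {
    static int clamp(int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

    // Convert one YUV pixel to packed 0xRRGGBB using the common
    // BT.601 fixed-point coefficients (what an NV21-to-RGB conversion
    // involves, in spirit).
    static int yuvToRgb(int y, int u, int v) {
        int c = 298 * (y - 16);
        int d = u - 128;
        int e = v - 128;
        int r = clamp((c + 409 * e + 128) >> 8);
        int g = clamp((c - 100 * d - 208 * e + 128) >> 8);
        int b = clamp((c + 516 * d + 128) >> 8);
        return (r << 16) | (g << 8) | b;
    }

    public static void main(String[] args) {
        // Nominal black (Y=16) and white (Y=235) with neutral chroma
        System.out.printf("%06x %06x%n",
                yuvToRgb(16, 128, 128), yuvToRgb(235, 128, 128));
    }
}
```

In practice on Android, YuvImage.compressToJpeg() is the right tool since it runs natively; the sketch only shows what the format conversion boils down to.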