How to use Android MediaCodec to encode Camera data (YUV420SP)?

Posted 2019-02-04 15:54

Question:

Thank you for your attention! I want to use the Android MediaCodec API to encode video frames acquired from the Camera, but unfortunately I have not managed to get it working. I am still not familiar with the MediaCodec API. Below is my code; I need your help to figure out what I should do.

1. The camera settings:

Parameters parameters = mCamera.getParameters();
parameters.setPreviewFormat(ImageFormat.NV21);
parameters.setPreviewSize(320, 240);
mCamera.setParameters(parameters);

2. Set up the encoder:

private void initCodec() {
    try {
        fos = new FileOutputStream(mVideoFile, false);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
    mMediaCodec = MediaCodec.createEncoderByType("video/avc");
    MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc",
            320,
            240);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    mMediaCodec.configure(mediaFormat,
            null,
            null,
            MediaCodec.CONFIGURE_FLAG_ENCODE);
    mMediaCodec.start();
    inputBuffers = mMediaCodec.getInputBuffers();
    outputBuffers = mMediaCodec.getOutputBuffers();
}

private void encode(byte[] data) {
    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(0);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(data);
        mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
    } else {
        return;
    }

    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    Log.i(TAG, "outputBufferIndex-->" + outputBufferIndex);
    do {
        if (outputBufferIndex >= 0) {
            ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
            System.out.println("buffer info-->" + bufferInfo.offset + "--"
                    + bufferInfo.size + "--" + bufferInfo.flags + "--"
                    + bufferInfo.presentationTimeUs);
            byte[] outData = new byte[bufferInfo.size];
            outBuffer.get(outData);
            try {
                if (bufferInfo.offset != 0) {
                    fos.write(outData, bufferInfo.offset, outData.length
                            - bufferInfo.offset);
                } else {
                    fos.write(outData, 0, outData.length);
                }
                fos.flush();
                Log.i(TAG, "out data -- > " + outData.length);
                mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo,
                        0);
            } catch (IOException e) {
                e.printStackTrace();
            }
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            outputBuffers = mMediaCodec.getOutputBuffers();
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            MediaFormat format = mMediaCodec.getOutputFormat();
        }
    } while (outputBufferIndex >= 0);
}

I guess the problem is in the encode() method. It is called from the Camera preview callback, like this:

initCodec();

//mCamera.setPreviewCallback(new MyPreviewCallback());
mCamera.setPreviewCallback(new PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        encode(data);
    }
});

I just have no idea how to do this correctly with the MediaCodec API. Can you give me some advice or links about it?

Thank you!

Answer 1:

I have solved the problem. The changes are as follows:

private synchronized void encode(byte[] data)
{
    inputBuffers = mMediaCodec.getInputBuffers();    // changed: fetch the buffers inside encode()
    outputBuffers = mMediaCodec.getOutputBuffers();

    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);  // changed: -1 blocks until an input buffer is free
    Log.i(TAG, "inputBufferIndex-->" + inputBufferIndex);
    //......
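
For reference, the complete method then looks roughly like the sketch below. This is only a sketch assuming the same fields as in the question (mMediaCodec, fos, inputBuffers, outputBuffers); because the buffer arrays are re-fetched at the top of every call, the INFO_OUTPUT_BUFFERS_CHANGED / INFO_OUTPUT_FORMAT_CHANGED cases are simply left to be picked up on the next call.

private synchronized void encode(byte[] data) {
    // Fetch the buffer arrays on every call instead of only once in initCodec().
    inputBuffers = mMediaCodec.getInputBuffers();
    outputBuffers = mMediaCodec.getOutputBuffers();

    // -1 blocks until an input buffer is free; a timeout of 0 often returned -1 immediately.
    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(data);
        mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
    }

    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    while (outputBufferIndex >= 0) {
        ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
        // Respect both offset and size when copying the encoded data out.
        outputBuffer.position(bufferInfo.offset);
        outputBuffer.limit(bufferInfo.offset + bufferInfo.size);
        byte[] outData = new byte[bufferInfo.size];
        outputBuffer.get(outData);
        try {
            fos.write(outData);   // raw H.264 elementary stream, not a playable .mp4
            fos.flush();
        } catch (IOException e) {
            e.printStackTrace();
        }
        mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    }
}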

Next, you will find that the colors of your encoded video are wrong. For more information, see MediaCodec and Camera: colorspaces don't match.



Answer 2:

The YUV420 formats output by the camera are incompatible with the formats accepted by the MediaCodec AVC encoder. In the best case, it's essentially NV12 vs. NV21 (U and V planes are reversed), requiring a manual reordering. In the worst case, as of Android 4.2, the encoder input format may be device-specific.
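
As an illustration of that reordering, here is a minimal sketch that converts an NV21 preview frame into the I420 layout that COLOR_FormatYUV420Planar nominally describes. Whether your device's encoder actually accepts this layout is exactly the caveat above, so treat it as an assumption to verify, not a guaranteed fix; the helper name is just for illustration.

// Sketch only: NV21 (Y plane + interleaved V/U) -> I420 (Y plane + U plane + V plane).
// Assumes width and height are even, as with the 320x240 preview above.
private static byte[] nv21ToI420(byte[] nv21, int width, int height) {
    int ySize = width * height;
    byte[] i420 = new byte[ySize * 3 / 2];
    // The Y plane is identical in both layouts.
    System.arraycopy(nv21, 0, i420, 0, ySize);
    int uIndex = ySize;              // start of the U plane in I420
    int vIndex = ySize + ySize / 4;  // start of the V plane in I420
    for (int i = ySize; i < ySize * 3 / 2; i += 2) {
        i420[vIndex++] = nv21[i];     // V comes first in each NV21 chroma pair
        i420[uIndex++] = nv21[i + 1]; // U comes second
    }
    return i420;
}

The converted frame would then be what you pass to queueInputBuffer(), instead of the raw preview bytes.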

You're better off using MediaRecorder to connect the camera hardware to the encoder.
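
If writing a recorded file is all you need, that route is much shorter. The sketch below is only illustrative (the output path, preview Surface, size, and bitrate are placeholders, not values required by the question); the call order shown (video source before output format, output format before the encoder settings) is what MediaRecorder requires.

// Sketch: let MediaRecorder drive the camera-to-encoder path directly.
private MediaRecorder startRecording(Camera camera, Surface previewSurface, String outputPath)
        throws IOException {
    camera.unlock();  // hand the camera over to MediaRecorder
    MediaRecorder recorder = new MediaRecorder();
    recorder.setCamera(camera);
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setVideoSize(320, 240);
    recorder.setVideoFrameRate(15);
    recorder.setVideoEncodingBitRate(125000);
    recorder.setOutputFile(outputPath);
    recorder.setPreviewDisplay(previewSurface);
    recorder.prepare();
    recorder.start();
    return recorder;   // later: recorder.stop(); recorder.release(); camera.lock();
}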

Update: It's now possible to pass the camera's Surface preview to MediaCodec, instead of using the YUV data in the ByteBuffer. This is faster and more portable. See the CameraToMpegTest sample on bigflake.
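
A minimal sketch of that Surface input path (API 18+) is below; how the camera preview frames then get rendered into the Surface (typically via a SurfaceTexture and OpenGL ES, as CameraToMpegTest demonstrates) is omitted here.

// Sketch: configure the AVC encoder for Surface input instead of YUV ByteBuffers.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 320, 240);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// Must be called after configure() and before start().
Surface inputSurface = encoder.createInputSurface();
encoder.start();
// Anything rendered into inputSurface is fed to the encoder; CameraToMpegTest
// draws the camera's SurfaceTexture frames into it with OpenGL ES.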