Android MediaCodec slower in async-mode than in sync-mode

Published 2019-03-31 04:20

Question:

Once again a question of mine regarding Android's MediaCodec class. I have successfully managed to decode raw H.264 content and display the result in two TextureViews. The H.264 stream comes from a server that is running an OpenGL scene. The scene has a camera and is therefore responsive to user input. To further reduce the latency between an input on the server and the actual result on the smartphone, I was thinking about using MediaCodec's async mode. Here is how I set up both variants, synchronous and asynchronous:

Async:

//decoderCodec is "video/avc"; codec is the corresponding MediaCodec decoder instance (created elsewhere)
MediaFormat fmt = MediaFormat.createVideoFormat(decoderCodec, 1280, 720);
codec.setCallback(new MediaCodec.Callback() {

    @Override
    public void onInputBufferAvailable(MediaCodec codec, int index) {
        byte[] frameData;
        try {
            frameData = frameQueue.take(); //this call is blocking
        } catch (InterruptedException e) {
            return;
        }

        ByteBuffer inputData = codec.getInputBuffer(index);
        inputData.clear();
        inputData.put(frameData);

        codec.queueInputBuffer(index, 0, frameData.length, 0, 0);
    }

    @Override
    public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
        codec.releaseOutputBuffer(index, true);
    }

     //The other two callback methods (onError, onOutputFormatChanged) are left blank at the moment.

});


codec.configure(fmt, surface, null, 0);
codec.start();

Sync: (set up like the async variant, minus the codec.setCallback(...) part). The class both variants reside in implements Runnable:

public void run() {

    while(!Thread.interrupted())
    {
        if(!IS_ASYNC) {
            byte[] frameData;
            try {
                frameData = frameQueue.take(); //this call is blocking
            } catch (InterruptedException e) {
                break;
            }

            int inIndex = codec.dequeueInputBuffer(BUFFER_TIMEOUT);

            if (inIndex >= 0) {
                ByteBuffer input = codec.getInputBuffer(inIndex);
                input.clear();
                input.put(frameData);
                codec.queueInputBuffer(inIndex, 0, frameData.length, 0, 0);
            }

            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
            int outIndex = codec.dequeueOutputBuffer(bufferInfo, BUFFER_TIMEOUT);

            if(outIndex >= 0)
                codec.releaseOutputBuffer(outIndex, true);
        }
        else {
            //Just for testing: in async mode this thread has nothing to do.
            try {
                Thread.sleep(3000);
            } catch (InterruptedException e) {
                break;
            }
        }
    }
}

Both approaches work, but I'm observing that video played in synchronous mode is much smoother and its latency is lower.

I came up with the idea of using the async mode because frameQueue is a LinkedBlockingDeque, and I thought: if the synchronous decoder waits too long for new frame data to arrive, decoded output may already be available but not displayed, because of the blocking nature of the queue. On the other hand, I didn't want to do something like busy waiting and poll the queue, the input buffers and the output buffers all the time.

So I tried the async mode using the callbacks, but the result I get is worse than in synchronous mode. My question to you is: why? Am I misusing the async mode, or is it something else?

Thanks for any feedback!

Christoph

Answer 1:

I would not be surprised if the blocking call in onInputBufferAvailable is the culprit. It seems likely that both onInputBufferAvailable and onOutputBufferAvailable are called on the same thread, so if you block in one, you stop the other from running.
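For example (a hypothetical sketch, not something your code needs as-is): if you register the callback with your own handler (API 23+), all callback methods are delivered on that one thread, which makes the shared dispatch explicit; a blocking take() inside onInputBufferAvailable then also delays onOutputBufferAvailable. Here, callback stands for the MediaCodec.Callback instance from your question:

//Pin all MediaCodec.Callback methods to a single HandlerThread (API 23+).
//Blocking inside onInputBufferAvailable also delays onOutputBufferAvailable,
//because both callbacks share this one thread.
HandlerThread callbackThread = new HandlerThread("codec-callbacks");
callbackThread.start();
codec.setCallback(callback, new Handler(callbackThread.getLooper()));
codec.configure(fmt, surface, null, 0);
codec.start();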

I would suggest changing it so that onInputBufferAvailable just pushes the buffer index onto a queue and signals a different thread that another input buffer is available; that second thread then waits for indices from the queue and does the blocking fetch of input data there.
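A rough, untested sketch of that idea (inputIndexQueue and the feeder thread are placeholder names I'm introducing here; frameQueue and codec are the ones from your question, and error handling is omitted):

//Filled by onInputBufferAvailable, drained by the feeder thread below.
private final BlockingQueue<Integer> inputIndexQueue = new LinkedBlockingQueue<>();

@Override
public void onInputBufferAvailable(MediaCodec codec, int index) {
    //Never block here: just hand the free buffer index to the feeder thread.
    inputIndexQueue.add(index);
}

//Feeder thread: all blocking waits happen here, so onOutputBufferAvailable
//can still be delivered while we wait for the next frame from the server.
Thread feeder = new Thread(() -> {
    try {
        while (!Thread.interrupted()) {
            int index = inputIndexQueue.take();   //wait for a free input buffer
            byte[] frameData = frameQueue.take(); //wait for the next frame
            ByteBuffer inputData = codec.getInputBuffer(index);
            inputData.clear();
            inputData.put(frameData);
            codec.queueInputBuffer(index, 0, frameData.length, 0, 0);
        }
    } catch (InterruptedException e) {
        //shutting down
    }
});
feeder.start();

This keeps the callback thread free of blocking work, so output buffers can be released (and rendered) as soon as they are ready.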