Android: How to integrate a decoder into the multimedia framework


Recently I ported a video decoder to Android successfully, dumped its output to a SurfaceView, and verified the output using the native APIs. The next task is to implement play, pause, streaming, and so on, i.e. the other features of a media player. Doing this from scratch would be rework, since all of these functionalities are already provided by the Android multimedia framework. I have heard that a decoder can be made into a plug-in and integrated into Android's multimedia framework, but I could hardly find any information on this. Could anyone suggest relevant links or a solution to the above problem? Thanks in advance.

1 Answer

In the Android Stagefright framework, codecs are registered through media_codecs.xml. In the standard Android distribution, an example media_codecs.xml can be found here. All audio-visual components are registered as OMX components.

1. Codec Registration

To register your video decoder, you have to add a new entry under the <Decoders> list. To ensure that your codec is always picked up, make sure it is listed as the first entry for the specific MIME type. An example entry for an H.264 decoder could look like the one below.

<Decoders>
    <MediaCodec name="OMX.ABC.XYZ.H264.DECODER" type="video/avc" >
        <Quirk name="requires-allocate-on-input-ports" />
        <Quirk name="requires-allocate-on-output-ports" />
    </MediaCodec>
    <MediaCodec name="OMX.google.h264.decoder" type="video/avc" />
</Decoders>
Where,

a. OMX.ABC.XYZ.H264.DECODER is the name of your component.

b. video/avc is the MIME type of your component. In this example, it denotes an AVC / H.264 video decoder.

c. The next two statements denote the quirks, or special requirements, of your component. In the given example, requires-allocate-on-input-ports indicates to the Stagefright framework that the component prefers to allocate the buffers on all its input ports. Similarly, the other quirk informs the framework that the component also prefers to allocate the buffers on its output ports. For the list of quirks supported by the system, refer to the function OMXCodec::getComponentQuirks in the OMXCodec.cpp file. These quirks translate into flags which are then read by the framework to create and initialize the components, as sketched below.
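For illustration only, the translation from quirk strings to flags looks roughly like the following sketch. The flag names follow OMXCodec.cpp, but the real function's signature and the numeric flag values differ across Android releases:

// Minimal sketch (not the exact AOSP code) of how quirk strings from
// media_codecs.xml become flags inside Stagefright. Flag values here are
// illustrative; see OMXCodec::getComponentQuirks for the real mapping.
#include <stdint.h>
#include <string>
#include <vector>

enum {
    kRequiresAllocateBufferOnInputPorts  = 1,
    kRequiresAllocateBufferOnOutputPorts = 2,
};

static uint32_t quirksFromStrings(const std::vector<std::string> &quirkNames) {
    uint32_t quirks = 0;
    for (size_t i = 0; i < quirkNames.size(); ++i) {
        if (quirkNames[i] == "requires-allocate-on-input-ports") {
            quirks |= kRequiresAllocateBufferOnInputPorts;
        } else if (quirkNames[i] == "requires-allocate-on-output-ports") {
            quirks |= kRequiresAllocateBufferOnOutputPorts;
        }
        // Other quirks map to further flags in the same way.
    }
    return quirks;
}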

In the example illustration, your OMX component is registered before the default Google-implemented video decoder.

NOTE: If you are trying this on an end device, you will have to ensure that this entry is reflected in the final media_codecs.xml file on the device.

2. OMX Core Registration

To create your component and ensure that the correct factory method is invoked, you may have to register your OMX Core with the Stagefright framework.

To register a new core, you will have to create a new library named libstagefrighthw.so, which will be located at /system/lib on your end system. This library has to expose a createOMXPlugin symbol, which will be looked up via dlsym. A skeletal plugin is sketched below.
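The plugin interface is OMXPluginBase, declared in the Stagefright headers (media/hardware/OMXPluginBase.h). The sketch below shows what such a plugin could look like; the class name AbcOMXPlugin and the placeholder return codes are illustrative only, and the exact virtual method signatures may differ slightly between Android releases:

// Skeletal vendor plugin for libstagefrighthw.so. OMXPluginBase and the OMX
// types come from the Stagefright/OpenMAX headers; the signatures follow
// media/hardware/OMXPluginBase.h but may vary between releases.
// "AbcOMXPlugin" and all return values are placeholders.
#include <media/hardware/OMXPluginBase.h>
#include <utils/String8.h>
#include <utils/Vector.h>

namespace android {

struct AbcOMXPlugin : public OMXPluginBase {
    virtual OMX_ERRORTYPE makeComponentInstance(
            const char *name,
            const OMX_CALLBACKTYPE *callbacks,
            OMX_PTR appData,
            OMX_COMPONENTTYPE **component) {
        // Instantiate your OMX component here when 'name' matches the entry
        // registered in media_codecs.xml (e.g. "OMX.ABC.XYZ.H264.DECODER").
        return OMX_ErrorComponentNotFound;  // placeholder
    }

    virtual OMX_ERRORTYPE destroyComponentInstance(
            OMX_COMPONENTTYPE *component) {
        // Free the component created above.
        return OMX_ErrorNone;  // placeholder
    }

    virtual OMX_ERRORTYPE enumerateComponents(
            OMX_STRING name, size_t size, OMX_U32 index) {
        // Report the names of the components this plugin provides.
        return OMX_ErrorNoMore;  // placeholder
    }

    virtual OMX_ERRORTYPE getRolesOfComponent(
            const char *name, Vector<String8> *roles) {
        // Report roles such as "video_decoder.avc" for the given component.
        return OMX_ErrorNone;  // placeholder
    }
};

}  // namespace android

// The factory symbol that OMXMaster resolves with dlsym(); it must have
// C linkage and return a new plugin instance.
extern "C" android::OMXPluginBase *createOMXPlugin() {
    return new android::AbcOMXPlugin();
}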

The registration of the OMX core works as follows: OMXMaster invokes addVendorPlugin, which internally invokes addPlugin("libstagefrighthw.so"). In addPlugin, the createOMXPlugin symbol is looked up, and through it the other function pointers for makeComponentInstance, destroyComponentInstance, etc. are initialized; a simplified rendering of this lookup follows.
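For reference, the loading on the framework side looks roughly like the sketch below. It is modeled on OMXMaster::addVendorPlugin / addPlugin but is not a verbatim copy of the AOSP code, which adds error handling and keeps the plugin in a list:

// Simplified sketch of how the framework loads a vendor OMX core:
// dlopen("libstagefrighthw.so"), resolve "createOMXPlugin" with dlsym(),
// then use the returned OMXPluginBase.
#include <dlfcn.h>
#include <media/hardware/OMXPluginBase.h>

namespace android {

static OMXPluginBase *loadVendorPlugin() {
    void *handle = dlopen("libstagefrighthw.so", RTLD_NOW);
    if (handle == NULL) {
        return NULL;  // no vendor core installed on this device
    }

    typedef OMXPluginBase *(*CreateOMXPluginFunc)();
    CreateOMXPluginFunc createOMXPlugin =
            (CreateOMXPluginFunc)dlsym(handle, "createOMXPlugin");
    if (createOMXPlugin == NULL) {
        dlclose(handle);
        return NULL;  // library does not expose the expected symbol
    }

    // The framework then calls makeComponentInstance() on this plugin
    // whenever a registered component name is requested.
    return (*createOMXPlugin)();
}

}  // namespace android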

Once the OMX core is initialized, you are ready to run your own component within the Android framework. The reference for OMXMaster can be found here.

With these changes, your video decoder is integrated into the Android Stagefright framework.
