Android Video Player Using NDK, OpenGL ES, and FFmpeg

Posted 2019-01-29 19:11

Ok so here is what I have so far. I have built FFmpeg on Android and am able to use it fine. I can load a video into FFmpeg after passing the chosen filename from the Java side. To save on performance, I am writing the video player in the NDK rather than passing frames from FFmpeg to Java through JNI. I want to send frames from the video to an OpenGL surface, but I am having trouble figuring out how to get each frame of video and render it onto that surface. I have been stuck on this for a couple of weeks now with no luck. Hopefully someone can point me in the right direction.

Thanks!

1 Answer
chillily
Answered 2019-01-29 19:22

One way that springs to mind is to draw the pixels of your frame into a texture and then render that texture using OpenGL.
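A minimal sketch of that idea, assuming the decoded frame has already been converted to packed RGBA (for example with libswscale) and that an EGL context is current on the rendering thread; the function and variable names are illustrative, not code from the blog post:

```c
#include <stdint.h>
#include <GLES2/gl2.h>

/* Create a texture and copy one RGBA frame into it.
 * `rgba` must point to tex_w * tex_h * 4 tightly packed bytes. */
static GLuint upload_frame_to_texture(int tex_w, int tex_h, const uint8_t *rgba) {
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    /* Allocate storage for the texture and copy the frame's pixels into it. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tex_w, tex_h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    return tex;
}
```

Each frame you then draw a screen-filling quad textured with `tex`; in GLES 2.0 that is a trivial vertex/fragment shader pair that just samples the texture.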

I wrote a blog post a while back on how to go about this, primarily for old-skool pixel-based video games, but it also applies to your situation. The post is Android Native Coding in C, and I set up a GitHub repository with an example. Using this technique I have been able to get 60 FPS, even on first-generation hardware.

EDIT: regarding glTexImage2D vs. glTexSubImage2D for this approach.

Calling glTexImage2D will allocate video memory for your texture and copy the pixels you pass it into that memory (if you don't pass NULL). Calling glTexSubImage2D will update the pixels you specify in an already-allocated texture.

If you update the whole texture, there's little difference between calling one or the other; in fact, glTexImage2D is usually faster. But if you only update part of the texture, glTexSubImage2D wins out on speed.

You have to use power-of-2 texture sizes, so covering the screen on a hi-res device requires a 1024x512 texture, and a 512x512 texture on medium-resolution devices. The texture is larger than the screen area (hi-res is 800x400-ish), which means you only ever update part of it, so glTexSubImage2D is the way to go.
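As a rough sketch of that pattern under your FFmpeg setup (again, not the code from the post): allocate the power-of-2 texture once by passing NULL to glTexImage2D, then each frame convert the decoded AVFrame to RGBA with libswscale and update only the frame-sized region. The sizes, names, and the assumption that the SwsContext was created with AV_PIX_FMT_RGBA as its destination format are all mine:

```c
#include <stdint.h>
#include <GLES2/gl2.h>
#include <libavutil/frame.h>
#include <libswscale/swscale.h>

#define TEX_W 1024   /* power-of-2 texture covering an ~800x480 screen */
#define TEX_H 512

/* One-time setup: reserve video memory without supplying pixels. */
static void alloc_video_texture(GLuint tex) {
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, TEX_W, TEX_H, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
}

/* Per frame: convert the decoder's output (typically YUV420P) to RGBA,
 * then overwrite just the frame-sized sub-rectangle of the texture.
 * `rgba` must have a stride of exactly width*4 bytes, since GLES 2.0
 * has no GL_UNPACK_ROW_LENGTH. */
static void update_video_texture(GLuint tex, struct SwsContext *sws,
                                 const AVFrame *decoded, uint8_t *rgba) {
    uint8_t *dst_data[4]    = { rgba, NULL, NULL, NULL };
    int      dst_linesize[4] = { decoded->width * 4, 0, 0, 0 };

    sws_scale(sws, (const uint8_t * const *)decoded->data, decoded->linesize,
              0, decoded->height, dst_data, dst_linesize);

    glBindTexture(GL_TEXTURE_2D, tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                    decoded->width, decoded->height,
                    GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    /* ...then draw the textured quad and call eglSwapBuffers(). */
}
```

Because only part of the 1024x512 texture holds frame data, the quad's texture coordinates should be scaled to width/1024 and height/512 so the unused border isn't stretched onto the screen.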
