I'm trying to stream video on Android through FFmpeg. The output I get after decoding is in YUV format. Is it possible to render YUV frames directly to the Android screen?
Yes and no.
The output of the camera and of hardware video decoders is generally YUV, and frames from these sources are typically sent directly to the display. They may be converted along the way by the driver, usually with a hardware scaler and format converter; this is necessary for efficiency.
There isn't an API to allow an app to pass YUV frames around the same way. The basic problem is that "YUV" covers a lot of ground. The buffer format used by the video decoder may be a proprietary internal format that the various hardware modules can process efficiently; for your app to create a surface in this format, it would have to perform a conversion, and you're right back where you were performance-wise.
You should be able to use GLES2 shaders to do the conversion for you on the way to the display, but I don't have a pointer to code that demonstrates this.
Update: an answer to this question has a link to a WebRTC source file that demonstrates doing the YUV conversion in a GLES2 shader.
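For illustration only, here is a minimal sketch of what such a GLES2 conversion could look like (this is not the WebRTC code, just one plausible shape for it). It assumes the decoder produces planar I420 data and that the Y, U, and V planes have each been uploaded as a separate GL_LUMINANCE texture; the names (`buildYuvProgram`, `uTexY`, and so on) are invented for this example, and the coefficients are the standard BT.601 limited-range YUV-to-RGB conversion.

```kotlin
import android.opengl.GLES20

// Simple pass-through vertex shader for a full-screen quad.
private const val VERTEX_SHADER = """
    attribute vec4 aPosition;
    attribute vec2 aTexCoord;
    varying vec2 vTexCoord;
    void main() {
        gl_Position = aPosition;
        vTexCoord = aTexCoord;
    }
"""

// Fragment shader: samples one texture per plane and converts
// BT.601 limited-range YUV to RGB.
private const val FRAGMENT_SHADER = """
    precision mediump float;
    varying vec2 vTexCoord;
    uniform sampler2D uTexY;   // full-resolution luma plane
    uniform sampler2D uTexU;   // half-resolution chroma plane (I420)
    uniform sampler2D uTexV;
    void main() {
        float y = texture2D(uTexY, vTexCoord).r;
        float u = texture2D(uTexU, vTexCoord).r - 0.5;
        float v = texture2D(uTexV, vTexCoord).r - 0.5;
        float yc = 1.164 * (y - 0.0625);
        gl_FragColor = vec4(
            yc + 1.596 * v,
            yc - 0.391 * u - 0.813 * v,
            yc + 2.018 * u,
            1.0
        );
    }
"""

// Compiles and links the two shaders into a GL program.
// Must be called on a thread with a current EGL context.
fun buildYuvProgram(): Int {
    fun compile(type: Int, source: String): Int {
        val shader = GLES20.glCreateShader(type)
        GLES20.glShaderSource(shader, source)
        GLES20.glCompileShader(shader)
        return shader
    }
    val program = GLES20.glCreateProgram()
    GLES20.glAttachShader(program, compile(GLES20.GL_VERTEX_SHADER, VERTEX_SHADER))
    GLES20.glAttachShader(program, compile(GLES20.GL_FRAGMENT_SHADER, FRAGMENT_SHADER))
    GLES20.glLinkProgram(program)
    return program
}
```

In a setup along these lines, each decoded frame's three planes would be uploaded with `glTexImage2D`/`glTexSubImage2D` and a full-screen quad drawn with this program on a `GLSurfaceView` (or your own EGL surface), so the YUV-to-RGB step happens on the GPU on the way to the display.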