Android OpenGL combination of SurfaceTexture (external image) and ordinary texture

Posted 2019-02-03 13:02

I would like to mix the camera preview SurfaceTexture with an overlay texture. I am using these shaders for processing:

    private final String vss = "attribute vec2 vPosition;\n"
        + "attribute vec2 vTexCoord;\n"
        + "varying vec2 texCoord;\n"
        + "void main() {\n" 
        + "  texCoord = vTexCoord;\n"
        + "  gl_Position = vec4 ( vPosition.x, vPosition.y, 0.0, 1.0 );\n"
        + "}";

    private final String fss = "#extension GL_OES_EGL_image_external : require\n"
        + "precision mediump float;\n"
        + "uniform samplerExternalOES sTexture;\n"
        + "uniform sampler2D filterTexture;\n"
        + "varying vec2 texCoord;\n"
        + "void main() {\n"
        + "  vec4 t_camera = texture2D(sTexture, texCoord);\n"
        //+ "  vec4 t_overlayer = texture2D(filterTexture, texCoord);\n"
        //+ "  gl_FragColor = t_overlayer;\n" + "}";
        + "  gl_FragColor = t_camera;\n" + "}";

My goal is to mix t_camera and t_overlayer. When I show t_camera or t_overlayer separately, it works (showing the camera preview or the texture). But when I uncomment t_overlayer, t_camera becomes black (somehow badly sampled). My overlay texture is 512x512 and uses CLAMP_TO_EDGE. The problem occurs only on some devices, for example the Android emulator and the HTC Evo 3D; on the SGS3 and HTC One X it works just fine.
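For reference, the intended blend would look roughly like the fragment shader below. This is only a sketch: the alpha-based blend factor and the fssMix name are assumptions, not part of the original code.

    private final String fssMix = "#extension GL_OES_EGL_image_external : require\n"
        + "precision mediump float;\n"
        + "uniform samplerExternalOES sTexture;\n"
        + "uniform sampler2D filterTexture;\n"
        + "varying vec2 texCoord;\n"
        + "void main() {\n"
        + "  vec4 t_camera = texture2D(sTexture, texCoord);\n"
        + "  vec4 t_overlayer = texture2D(filterTexture, texCoord);\n"
        // blend the overlay over the camera frame using the overlay's alpha channel
        + "  gl_FragColor = mix(t_camera, t_overlayer, t_overlayer.a);\n"
        + "}";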

What is wrong? Is the Evo 3D missing some extension, or is something else going on?

7 Answers
贼婆χ
#2 · 2019-02-03 13:28

It appears to be a bug in the OpenGL implementation. The same code worked fine on a Samsung Note but not on a Nexus 4. It appears that getUniformLocation breaks on some devices for all uniforms that are declared after the samplerExternalOES uniform.

It also appears that the compiler sorts uniform variables alphabetically, so the workaround that made it work on both devices was to rename the samplerExternalOES uniform to something like zzzTexture.
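A quick way to spot this on an affected device is to check every glGetUniformLocation result after linking; a minimal sketch, assuming the program handle and the uniform names from the question:

    // -1 means the driver did not report the uniform (or it was optimized away).
    int sTextureHandle = GLES20.glGetUniformLocation(program, "sTexture");
    int filterTextureHandle = GLES20.glGetUniformLocation(program, "filterTexture");
    if (sTextureHandle == -1 || filterTextureHandle == -1) {
        Log.w("Shaders", "uniform lookup failed: sTexture=" + sTextureHandle
                + ", filterTexture=" + filterTextureHandle);
    }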

迷人小祖宗
#3 · 2019-02-03 13:34

I got the same issue on my Nexus 7 and it drove me crazy. Accessing either a samplerExternalOES or a sampler2D on its own worked fine, but accessing both in the same shader gave unexpected results. Sometimes the output would be black; sometimes one of the lookups would show bad quantization artifacts. The behaviour also varied depending on which texture units the samplers were bound to. I checked every OpenGL error and the glValidateProgram results.

Eventually, what worked was using a separate shader to simply read the camera output and render it into a texture. The resulting texture can then be accessed through a regular sampler2D and everything works exactly as expected. I suspect there is a bug somewhere related to samplerExternalOES.
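A minimal sketch of that two-pass approach, assuming an FBO with an ordinary RGBA texture attached; fboId, fboTextureId, the two program handles and drawFullScreenQuad() are placeholders for whatever setup you already have:

    // Pass 1: copy the camera frame (samplerExternalOES) into the FBO-attached texture.
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboId);
    GLES20.glUseProgram(cameraCopyProgram);          // shader that only samples sTexture
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, cameraTextureId);
    drawFullScreenQuad();

    // Pass 2: the FBO texture is now a regular sampler2D, so it can be mixed freely.
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    GLES20.glUseProgram(mixProgram);                 // shader with two sampler2D uniforms
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, fboTextureId);
    GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, overlayTextureId);
    drawFullScreenQuad();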

手持菜刀,她持情操
#4 · 2019-02-03 13:36

I may have had the same problem as well. After days of trying, I am posting my solutions here, hoping they will help others.

First, the problem statement: just like Lukáš Jezný, I have one preview texture and one overlay texture. It works fine on the Nexus 4/5 and most other devices, but shows nothing on the OPPO Find 5, Lenovo A820 and Lenovo A720.

Solutions:

(1) Just like Lukáš Jezný, use YUV data and transform it to RGB in the shader.

(2) Multipass drawing: draw the preview texture to a framebuffer once, read it, then draw it again to the screen.

(3) Use another program before you use your own program:

    GLES20.glUseProgram(dummyProgram);   // any other linked program
    GLES20.glUseProgram(realProgram);    // your "real" program

and it just works on the OPPO Find 5, Lenovo A820, Lenovo A720 and others. No one knows why.
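A minimal sketch of trick (3), assuming a compile-and-link helper named createProgram; the helper and the do-nothing shaders below are placeholders, not from the original answer:

    // A do-nothing program, linked once, only so glUseProgram can be called on it
    // before switching to the program that actually renders.
    String dummyVs = "attribute vec2 vPosition;\n"
        + "void main() { gl_Position = vec4(vPosition, 0.0, 1.0); }";
    String dummyFs = "precision mediump float;\n"
        + "void main() { gl_FragColor = vec4(0.0); }";
    int dummyProgram = createProgram(dummyVs, dummyFs);

    // Per frame, before drawing:
    GLES20.glUseProgram(dummyProgram);
    GLES20.glUseProgram(realProgram);   // the camera + overlay program from above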

[account banned]
#5 · 2019-02-03 13:38

The method above saved me a lot of time. Thank you, guru. For the external (camera) texture:

    // Bind the camera (external) texture to texture unit 1 and point the sampler at it.
    GLES20.glUniform1i(sTextureHandle, 1);
    GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, sTextureId);

For your 2D texture:

    // Bind the overlay texture to texture unit 0.
    GLES20.glUniform1i(filterTextureHandle, 0);
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, filterTextureID);

Changing the texture units like this is a good way to solve the problem.

劫难
#6 · 2019-02-03 13:40

This is not an answer, but rather an elaboration of the question; maybe it will help an OpenGL ES expert get an idea of the problem.


I have three textures used for overlays and one external texture used to capture output from the media player. If I use only the external texture, the output is as expected: frames from the media player. The same code works fine on the Nexus 4, Samsung Galaxy S3, S4, etc. (all of which use either Adreno GPUs or ARM's Mali-400). The hardware difference is that the Nexus 7 uses an Nvidia Tegra 3.


Edit (how it was solved on my side):

Nvidia Tegra 3 requires that the external texture sampler have the name that sorts lowest alphabetically among the samplers, while Adreno 220 seems to require the reverse. Also, Tegra 3 requires that the external texture be sampled last. On devices running Android 4.3 and newer these bugs may be fixed; on Nvidia's side it was a bug that was solved long ago, but the Nexus drivers were only updated later. So I had to check which GPU was present and adapt the code accordingly.
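A sketch of that runtime check, assuming it runs on the GL thread with a current context; the substring match and the two shader-source variants (fssForTegra, fssForAdreno) are illustrative only:

    // GL_RENDERER identifies the GPU, e.g. "NVIDIA Tegra 3" or "Adreno (TM) 320".
    String renderer = GLES20.glGetString(GLES20.GL_RENDERER);
    boolean isTegra = renderer != null && renderer.contains("Tegra");

    // Pick the shader variant whose sampler names satisfy this driver's ordering quirk.
    String fragmentSource = isTegra ? fssForTegra : fssForAdreno;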

祖国的老花朵
#7 · 2019-02-03 13:44

Referring to user1924406's post (https://stackoverflow.com/a/14050597/3250829) on splitting up access to the sampler2D texture and the samplerExternalOES texture: this is what I had to do, because the application I am developing reads from a file or streams from a server instead of using the device's camera. Using both textures in the same shader resulted in very strange colourization artifacts (on a Galaxy S3) or saturation and contrast issues (on a Nexus 4).

As such, the only way I have found to work around the samplerExternalOES bug is to use two shader programs: one that writes the content of the samplerExternalOES texture to an FBO, and another that takes the content of the FBO and writes it to the surface.

One thing to check is that when you render to an FBO, the texture coordinates can end up flipped. In my case the V (or T, or Y) coordinate was inverted, which resulted in an image mirrored across the horizontal axis, and I had to account for this in the second-stage fragment shader.
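A sketch of the second-stage lookup with that V flip applied (the shader string and names below are illustrative, not taken from my actual code):

    private final String fssSecondPass = "precision mediump float;\n"
        + "uniform sampler2D fboTexture;\n"   // output of the first (FBO) pass
        + "varying vec2 texCoord;\n"
        + "void main() {\n"
        // flip V because the first pass rendered into the FBO upside down
        + "  gl_FragColor = texture2D(fboTexture, vec2(texCoord.x, 1.0 - texCoord.y));\n"
        + "}";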

This is a war story that I wanted to share in case there are some of you that may need to read from a file or stream from a server instead of taking directly from the Camera.
