How to obtain the RGB colors of the camera texture

Question:

I'm trying to port a .NET app to Android, where I capture each frame from the camera and then modify it according to user settings before displaying it. Doing it in .NET was simple: I could query the camera for the next image and get a bitmap whose pixels I could access at will.

One of the many processing options requires the application to obtain the intensity histogram of each captured image and then do some modifications to the captured image before displaying the result (based on user settings). What I'm attempting to do is to capture and modify the camera preview in Android.
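
For reference, the per-frame processing I have in mind is essentially this kind of loop (a minimal sketch, assuming the pixels eventually end up in an RGBA byte array; rgba, width and height are placeholder names):

// Minimal sketch: intensity histogram over an RGBA byte array (4 bytes per pixel).
private int[] computeHistogram(byte[] rgba, int width, int height) {
    int[] histogram = new int[256];
    for (int i = 0; i < width * height * 4; i += 4) {
        int r = rgba[i]     & 0xFF;
        int g = rgba[i + 1] & 0xFF;
        int b = rgba[i + 2] & 0xFF;
        // same luma weights as in the gray-scale fragment shader below
        int intensity = (int) (0.299f * r + 0.587f * g + 0.114f * b);
        histogram[intensity]++;
    }
    return histogram;
}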

I understand that the "best" way to achieve some sort of real-time-ish camera processing is to use OpenGL as the preview framework, rendering the camera frames through a GLES11Ext.GL_TEXTURE_EXTERNAL_OES texture.
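
The texture itself is created with the usual SurfaceTexture pattern, roughly like this (simplified; the camera wiring is only sketched in the comments):

// Rough sketch of the usual external-texture setup for the camera preview.
private int createExternalTexture() {
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[0]);
    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    return textures[0];
}

// mTexture = createExternalTexture();
// SurfaceTexture surfaceTexture = new SurfaceTexture(mTexture);
// camera.setPreviewTexture(surfaceTexture); // old Camera API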

I am able to capture the preview and do some processing in my fragment shader, such as turning the image gray scale, modifying fragment colors, or threshold clipping. However, for heavier processing such as computing a histogram or applying a Fast Fourier Transform, I need fast read/write access to all the pixels of the captured image contained in the texture, in RGB format, before displaying it.
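
The only route I currently know for getting the pixels back on the CPU is to render the external texture into an offscreen framebuffer and read it back with glReadPixels, roughly like the sketch below, but I do not know whether that is fast enough for per-frame processing (which is part of why I'm asking). mFramebuffer, width and height are placeholders:

// Sketch of the read-back route I'm considering (uses java.nio.ByteBuffer / ByteOrder).
// 1. Bind a framebuffer that has a regular RGBA texture attached, and render the camera quad into it.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mFramebuffer);
GLES20.glViewport(0, 0, width, height);
// ... run the same draw code as below, but targeting this framebuffer ...

// 2. Read the rendered RGBA pixels back to the CPU.
ByteBuffer pixelBuffer = ByteBuffer.allocateDirect(width * height * 4)
        .order(ByteOrder.nativeOrder());
GLES20.glReadPixels(0, 0, width, height,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixelBuffer);

// 3. Process pixelBuffer (e.g. the histogram loop above), then upload the result for display.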

I'm using Java with OpenGL ES 2.0 for Android.

My current code for drawing does the following:

private int mTexture; // texture handle created to hold the captured image
...

// Called for every captured frame
public void onDraw()
{
    int mPositionHandle;
    int mTextureCoordHandle;

    GLES20.glUseProgram(mProgram);

    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTexture);

    // TODO:
    //  Obtain RGB pixels of the texture and manipulate here.

    // TODO:
    //  Put the resulting RGB pixels in a texture for display.


    // prepare for drawing
    mPositionHandle = GLES20.glGetAttribLocation(mProgram, "position");
    GLES20.glEnableVertexAttribArray(mPositionHandle);
    GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);

    mTextureCoordHandle = GLES20.glGetAttribLocation(mProgram, "inputTextureCoordinate");
    GLES20.glEnableVertexAttribArray(mTextureCoordHandle);
    GLES20.glVertexAttribPointer(mTextureCoordHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, textureVerticesBuffer);

    // draw the texture
    GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length,
            GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
}

My vertex and fragment shaders are very simple:

Vertex shader:

attribute vec4 position;
attribute vec2 inputTextureCoordinate;
varying vec2 textureCoordinate;
void main()
{
    gl_Position = position;
    textureCoordinate = inputTextureCoordinate;
}

Fragment shader (accesses the captured image directly):

/* Shader: Gray Scale */
#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 textureCoordinate;
uniform samplerExternalOES s_texture;
void main()
{
    float clr = dot(texture2D(s_texture, textureCoordinate), vec4(0.299, 0.587, 0.114, 0.0));
    gl_FragColor = vec4(clr, clr, clr, 1.0);
}

Ideally, I would like to obtain the width and height of the captured texture, and to read and modify the RGB value of every pixel in the capture (or write the result into another texture), for example as an array of bytes where each byte represents a color channel, for processing before display.
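
For the "write into another texture" part, I assume the modified bytes could be uploaded to a regular GL_TEXTURE_2D and drawn with a plain sampler2D shader, something like this (again only a sketch; mOutputTexture and modifiedPixelBuffer are placeholders, and the texture is assumed to already exist with its filters set):

// Sketch: upload modified RGBA bytes into a regular 2D texture for display.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mOutputTexture);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
        width, height, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, modifiedPixelBuffer);
// ... then draw a full-screen quad sampling this texture instead of the external one.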

I am just starting to learn OpenGL ES and picked up this project along the way. Any help is deeply appreciated, thank you.