I want to do camera image processing on the GPU on Android.
In my current set-up I use a SurfaceTexture to capture frames from the camera stream as an OpenGL ES texture. This is an efficient way to make the camera stream accessible in my shaders. (http://developer.android.com/reference/android/graphics/SurfaceTexture.html)
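For reference, the working OpenGL ES path looks roughly like this (a minimal sketch of my own; variable names are placeholders, and EGL/thread set-up and error handling are omitted):

```java
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

// Generate a GL texture name to back the SurfaceTexture.
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
int textureId = textures[0];

// Camera frames arrive as an external OES texture, not GL_TEXTURE_2D,
// so the shader must sample it via samplerExternalOES.
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

// SurfaceTexture requires the raw GL texture ID in its constructor --
// this is exactly the ID that RenderScript's Allocation does not expose.
SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);

Camera camera = Camera.open();
camera.setPreviewTexture(surfaceTexture);
camera.startPreview();

// Later, on the GL thread, before sampling the texture in a shader:
surfaceTexture.updateTexImage();
```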
Now I would like to start using the new RenderScript API instead of using OpenGL ES directly. (http://developer.android.com/guide/topics/renderscript/index.html)
But to create a SurfaceTexture, I need to pass an OpenGL texture ID to its constructor. Unfortunately that texture ID is not available: RenderScript uses the Allocation class to load textures, and Allocation does not expose the underlying texture ID. So I am not able to create a SurfaceTexture when using RenderScript.
I have read all the documentation on RenderScript (which is still pretty sparse) and looked at the samples, but none of it covers this subject.
So my question is: is it possible to use SurfaceTexture in combination with RenderScript, or is there some other efficient way to get the live camera stream into a RenderScript graphics script?