I am trying to develop an app that lets me draw on a video while recording it, and then save both the video and the drawing in one mp4 file for later use. I want to use the camera2 API, since my app needs to run on devices at API 21 and higher, and I always avoid deprecated libraries.
I tried many approaches, including FFmpeg, in which I overlaid the TextureView.getBitmap() (from the camera) with a bitmap taken from the canvas. It worked, but since getBitmap() is a slow function, the video couldn't capture enough frames (not even 25 fps) and played back too fast. I want audio to be included as well.
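For context, the per-frame composition I was doing looked roughly like this (a simplified sketch of my own code; drawingBitmap is the bitmap backing my drawing canvas):

Bitmap cameraFrame = textureView.getBitmap(); // slow: copies the current frame
Bitmap composed = Bitmap.createBitmap(cameraFrame.getWidth(),
        cameraFrame.getHeight(), Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(composed);
canvas.drawBitmap(cameraFrame, 0, 0, null);   // camera frame first
canvas.drawBitmap(drawingBitmap, 0, 0, null); // drawing layer on top
// 'composed' was then handed to FFmpeg as a single video frame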
I also thought about the MediaProjection API, but I am not sure it can restrict its VirtualDisplay to only the layout containing the camera and the drawing, because the app user may add text to the video as well, and I don't want the keyboard to appear in the recording.
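As far as I understand, a VirtualDisplay mirrors the whole screen into a Surface rather than a single view, roughly like this (a sketch; width, height, densityDpi and mediaRecorder are placeholders, and the screen-capture permission intent has to be granted first):

MediaProjectionManager mgr =
        (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
// startActivityForResult(mgr.createScreenCaptureIntent(), REQUEST_CODE) runs first;
// resultCode and data come back in onActivityResult()
MediaProjection projection = mgr.getMediaProjection(resultCode, data);
VirtualDisplay display = projection.createVirtualDisplay("capture",
        width, height, densityDpi,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
        mediaRecorder.getSurface(), null, null); // mirrors the entire display, keyboard included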
Please help; it's been a week of research and I have found nothing that works for me.
P.S.: I don't mind if a little processing time happens after the user presses the "Stop Recording" button.
EDITED:
Now, after Eddy's answer, I am using the shadercam app to draw on the camera surface, since it handles the video rendering. The workaround is to render my canvas into a bitmap and then into a GL texture, but I am not able to do it successfully. I need your help, guys; I need to finish the app :S
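To make that step concrete, this is the bitmap-to-texture upload as I understand it (a rough sketch; the texture handle is mine, and I assume shadercam's addTexture/updateTexture wrap something similar):

int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
// first upload: allocates texture storage and copies the bitmap's pixels
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, textBitmap, 0);
// later frames: overwrite the pixels without reallocating
GLUtils.texSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, textBitmap);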
I am using the shadercam library (https://github.com/googlecreativelab/shadercam), and I replaced the "ExampleRenderer" file with the following code:
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.SurfaceTexture;
import android.opengl.GLES20;

import com.androidexperiments.shadercam.gl.CameraRenderer;

public class WriteDrawRenderer extends CameraRenderer
{
    // sentinel meaning "no pending touch"; reset after each draw
    private static final float NO_TOUCH = 1000000000f;

    private float offsetR = 1f;
    private float offsetG = 1f;
    private float offsetB = 1f;

    // written from the UI thread, read on the GL thread
    private volatile float touchX = NO_TOUCH;
    private volatile float touchY = NO_TOUCH;

    private Bitmap textBitmap;
    private int textureId;
    private boolean isFirstTime = true;

    // a canvas that draws into textBitmap instead of rendering to the screen
    private Canvas bitmapCanvas;

    // allocated once instead of on every frame
    private final Paint yellowPaint = new Paint();

    /**
     * If no shader names are passed, the default shaders in the assets folder
     * of shadercam are used. Base all custom shaders on those, since some
     * default uniforms/textures are passed every frame for the camera
     * coordinates and texture coordinates.
     */
    public WriteDrawRenderer(Context context, SurfaceTexture previewSurface, int width, int height)
    {
        super(context, previewSurface, width, height, "touchcolor.frag.glsl", "touchcolor.vert.glsl");
        yellowPaint.setColor(Color.YELLOW);
        yellowPaint.setAntiAlias(true); // smooth text edges
        yellowPaint.setTextSize(70);
        // other setup, if needed, goes here
    }

    /**
     * We override {@link #setUniformsAndAttribs()} and make sure to call super
     * so we can add our own uniforms to our shaders here. CameraRenderer
     * handles the rest for us automatically.
     */
    @Override
    protected void setUniformsAndAttribs()
    {
        super.setUniformsAndAttribs();

        int offsetRLoc = GLES20.glGetUniformLocation(mCameraShaderProgram, "offsetR");
        int offsetGLoc = GLES20.glGetUniformLocation(mCameraShaderProgram, "offsetG");
        int offsetBLoc = GLES20.glGetUniformLocation(mCameraShaderProgram, "offsetB");
        GLES20.glUniform1f(offsetRLoc, offsetR);
        GLES20.glUniform1f(offsetGLoc, offsetG);
        GLES20.glUniform1f(offsetBLoc, offsetB);

        if (touchX < NO_TOUCH && touchY < NO_TOUCH)
        {
            if (isFirstTime)
            {
                // lazily create the off-screen bitmap the canvas draws into
                textBitmap = Bitmap.createBitmap(mSurfaceWidth, mSurfaceHeight, Bitmap.Config.ARGB_8888);
                bitmapCanvas = new Canvas(textBitmap);
            }

            bitmapCanvas.drawText("Test Text", touchX, touchY, yellowPaint);

            if (isFirstTime)
            {
                // first frame: create the GL texture and bind it to the shader sampler
                textureId = addTexture(textBitmap, "textBitmap");
                isFirstTime = false;
            }
            else
            {
                // subsequent frames: re-upload the updated bitmap
                updateTexture(textureId, textBitmap);
            }

            // consume the touch so the text is drawn only once per tap
            touchX = NO_TOUCH;
            touchY = NO_TOUCH;
        }
    }

    /**
     * Takes touch points on the TextureView and uses them as the position at
     * which text is drawn into the overlay bitmap.
     * @param rawX raw x on screen
     * @param rawY raw y on screen
     */
    public void setTouchPoint(float rawX, float rawY)
    {
        this.touchX = rawX;
        this.touchY = rawY;
    }
}
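For completeness, this is how I forward touches from the preview to the renderer (mRenderer is my WriteDrawRenderer instance; names are from my own setup):

textureView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN
                || event.getAction() == MotionEvent.ACTION_MOVE) {
            mRenderer.setTouchPoint(event.getRawX(), event.getRawY());
        }
        return true;
    }
});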
Please help, guys; it's been a month and I am still stuck with the same app :( and I have no idea about OpenGL. I have spent two weeks trying to use this project for my app, and nothing is rendered on the video.
Thanks in advance!