I've done image warping using OpenCV in Python and C++; see the Coca Cola logo warped into place at the corners I had selected:
Using the following images:
and this:
Full album with transition pics and description here
I need to do exactly this, but in OpenGL. I'll have:
Corners inside which I have to map the warped image
A homography matrix that maps the logo image onto the warped logo you see inside the final image (obtained using OpenCV's warpPerspective), something like this:
[[ 2.59952324e+00,  3.33170976e-01, -2.17014066e+02],
 [ 8.64133587e-01,  1.82580111e+00, -3.20053715e+02],
 [ 2.78910149e-03,  4.47911310e-05,  1.00000000e+00]]
Main image (the running track image here)
Overlay image (the Coca Cola image here)
Is it possible? I've read a lot and started on OpenGL basics tutorials, but can it be done from just what I have? Would the OpenGL implementation be faster, say around ~10ms?
I'm currently playing with this tutorial here: http://ogldev.atspace.co.uk/www/tutorial12/tutorial12.html Am I going in the right direction? Total OpenGL newbie here, please bear with me. Thanks.
[edit] This worked on my Galaxy S9, but on my car's Android it had an issue where the whole output texture was white. I've stuck to the original shader and it works :)
You can use mat3*vec3 ops in the fragment shader:
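A minimal sketch of that kind of shader (GLSL ES 2.0), assuming the homography has already been converted to work on [0,1] texture coordinates; the uniform and varying names are only illustrative:

```glsl
// Sketch only: uHomography is assumed to map output texture coordinates back
// into the logo texture (i.e. the inverse mapping), expressed in [0,1] space.
precision highp float;

varying vec2 vTexCoord;        // interpolated texture coordinate from the vertex shader
uniform sampler2D uTexture;    // the overlay (logo) texture
uniform mat3 uHomography;      // 3x3 homography passed in as a mat3

void main()
{
    vec3 p = uHomography * vec3(vTexCoord, 1.0);   // project (s, t, 1)
    vec2 uv = p.xy / p.z;                          // perspective divide
    gl_FragColor = texture2D(uTexture, uv);
}
```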
If you want to have a transparent background, don't forget to add:
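Presumably the standard alpha-blending state, something like:

```cpp
// Enable alpha blending so transparent parts of the overlay let the
// background show through (standard GL / GL ES calls).
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
```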
And set the transpose flag (in case you use the above shader):
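For example, when uploading the matrix with glUniformMatrix3fv (the uniform name matches the sketch above and is an assumption):

```cpp
// OpenCV-style homographies are row-major while GLSL mat3 is column-major,
// hence the transpose flag. Note: OpenGL ES 2.0 requires transpose == GL_FALSE,
// so on ES 2.0 you would transpose the 9 floats on the CPU instead.
GLint loc = glGetUniformLocation(program, "uHomography");
glUniformMatrix3fv(loc, 1, GL_TRUE, homographyRowMajor);  // 9 floats, row-major
```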
After trying a number of solutions proposed here and elsewhere, I ended up solving this by writing a fragment shader that replicates what 'warpPerspective' does.
The fragment shader code looks something like:
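A rough sketch of such a shader (GLSL ES 2.0); here the matrix is assumed to work in pixel coordinates, like warpPerspective, so the image sizes are passed in as well, and all names are illustrative:

```glsl
precision highp float;

varying vec2 vTexCoord;              // output texture coordinate in [0,1]
uniform sampler2D uInputTexture;     // source (logo) image
uniform mat3 uInverseHomography;     // inverse of the warpPerspective matrix
uniform vec2 uOutputSize;            // destination size in pixels
uniform vec2 uInputSize;             // source size in pixels

void main()
{
    // Destination pixel position, as warpPerspective would see it.
    vec3 dstPixel = vec3(vTexCoord * uOutputSize, 1.0);

    // Back-project into the source image and do the perspective divide.
    vec3 srcPixel = uInverseHomography * dstPixel;
    vec2 srcUV = (srcPixel.xy / srcPixel.z) / uInputSize;

    // Outside the source image: output transparent instead of sampling garbage.
    if (srcUV.x < 0.0 || srcUV.x > 1.0 || srcUV.y < 0.0 || srcUV.y > 1.0) {
        gl_FragColor = vec4(0.0);
    } else {
        gl_FragColor = texture2D(uInputTexture, srcUV);
    }
}
```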
Note that the homography matrix we are passing in here is the INVERSE HOMOGRAPHY MATRIX! You have to invert the homography matrix that you would pass into 'warpPerspective'; otherwise this code will not work.
The vertex shader does nothing but pass through the coordinates:
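Something along these lines (GLSL ES 2.0; attribute and varying names are illustrative):

```glsl
attribute vec4 aPosition;    // clip-space position, e.g. the (-1,-1)..(1,1) quad
attribute vec2 aTexCoord;    // texture coordinate, e.g. (0,0)..(1,1)
varying vec2 vTexCoord;

void main()
{
    vTexCoord = aTexCoord;   // just forward the texture coordinate
    gl_Position = aPosition; // and the position, unchanged
}
```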
Pass in unaltered texture coordinates and position coordinates (i.e. textureCoordinates = [(0,0),(0,1),(1,0),(1,1)] and positionCoordinates = [(-1,-1),(-1,1),(1,-1),(1,1)], for a triangle strip), and this should work!
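A host-side sketch of feeding that quad to the shaders with client-side arrays and a triangle-strip draw; the attribute names match the vertex-shader sketch above and are assumptions:

```cpp
// Four corners of a full-screen quad, drawn as a triangle strip.
const GLfloat positions[] = { -1.f, -1.f,   -1.f, 1.f,   1.f, -1.f,   1.f, 1.f };
const GLfloat texCoords[] = {  0.f,  0.f,    0.f, 1.f,   1.f,  0.f,   1.f, 1.f };

GLint posLoc = glGetAttribLocation(program, "aPosition");
GLint texLoc = glGetAttribLocation(program, "aTexCoord");
glEnableVertexAttribArray(posLoc);
glEnableVertexAttribArray(texLoc);
glVertexAttribPointer(posLoc, 2, GL_FLOAT, GL_FALSE, 0, positions);
glVertexAttribPointer(texLoc, 2, GL_FLOAT, GL_FALSE, 0, texCoords);

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
```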
You can do perspective warping of the texture using texture2DProj(), or alternatively using texture2D() by dividing the st coordinates of the texture (which is what texture2DProj does). Have a look here: Perspective correct texturing of trapezoid in OpenGL ES 2.0.
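A minimal fragment-shader sketch of the two equivalent options (GLSL ES 2.0), assuming the projective texture coordinate (s, t, q) is computed per vertex and passed in as a varying; names are illustrative:

```glsl
precision highp float;

varying vec3 vTexCoord;        // projective texture coordinate (s, t, q)
uniform sampler2D uTexture;

void main()
{
    // Option 1: texture2DProj does the divide by the last component for you.
    gl_FragColor = texture2DProj(uTexture, vTexCoord);

    // Option 2 (equivalent): divide the st coordinates by q yourself.
    // gl_FragColor = texture2D(uTexture, vTexCoord.xy / vTexCoord.z);
}
```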
warpPerspective projects the (x,y,1) coordinate with the matrix and then divides (u,v) by w, like texture2DProj(). You'll have to modify the matrix so the resulting coordinates are properly normalised.

In terms of performance, if you want to read the data back to the CPU your bottleneck is glReadPixels. How long it will take depends on your device. If you're just displaying, the OpenGL ES calls will take much less than 10ms, assuming that you have both textures loaded to GPU memory.
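For the normalisation mentioned above, a rough host-side sketch of converting a pixel-space inverse homography (derived from the warpPerspective matrix) into one that works on [0,1] texture coordinates; the row-major layout and all names are assumptions:

```cpp
#include <array>

using Mat3 = std::array<float, 9>;  // row-major 3x3

// Plain 3x3 matrix multiply.
static Mat3 mul(const Mat3& a, const Mat3& b) {
    Mat3 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                r[3 * i + j] += a[3 * i + k] * b[3 * k + j];
    return r;
}

// invH: inverse of the warpPerspective homography, in pixel coordinates.
// Returns a matrix mapping destination texture coords to source texture coords
// (still homogeneous, so the shader keeps its divide by the last component).
Mat3 toTextureSpace(const Mat3& invH,
                    float srcW, float srcH,    // overlay (logo) size in pixels
                    float dstW, float dstH) {  // output image size in pixels
    const Mat3 texToDstPixels = { dstW, 0, 0,   0, dstH, 0,   0, 0, 1 };
    const Mat3 srcPixelsToTex = { 1.0f / srcW, 0, 0,   0, 1.0f / srcH, 0,   0, 0, 1 };
    return mul(srcPixelsToTex, mul(invH, texToDstPixels));
}
```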