I have mouse coordinates mousePos, a view matrix view, and a perspective projection matrix pMatrix.
I translate the coordinates into world space: I find the inverse projection matrix and the inverse view matrix and multiply them by the mouse coordinates. The ray origin has z = 4 and the ray end has z = -100.
In the first case I get the coordinates mouseDir1 = (-0.1985, 0.02887, 4), and in the second case mouseDir2 = (-0.1985, 0.02887, -100).
Why are the x, y coordinates the same?
private Vector3f getCoord(MouseInput mouseInput, float z) {
    int wdwWitdh = 640;
    int wdwHeight = 640;
    Vector2d mousePos = mouseInput.getCurrentPos();
    float x = (float) (2 * mousePos.x) / (float) wdwWitdh - 1.0f;
    float y = 1.0f - (float) (2 * mousePos.y) / (float) wdwHeight;
    Matrix4f invProjectionMatrix = new Matrix4f();
    invProjectionMatrix.set(pMatrix);
    invProjectionMatrix.invert();
    Vector4f tmpVec = new Vector4f();
    tmpVec.set(x, y, z, 0);
    tmpVec.mul(invProjectionMatrix);
    tmpVec.z = z;
    tmpVec.w = 0.0f;
    Matrix4f viewMatrix = new Matrix4f().set(view);
    Matrix4f invViewMatrix = new Matrix4f();
    invViewMatrix.set(viewMatrix);
    invViewMatrix.invert();
    tmpVec.mul(invViewMatrix);
    Vector3f mouseDir1 = new Vector3f();
    mouseDir1.set(tmpVec.x, tmpVec.y, tmpVec.z);
    // TEST CHECK with z = -100:
    // end coordinates of the ray
    Vector4f tmpVec1 = new Vector4f();
    tmpVec1.set(x, y, -100, 1.0f);
    tmpVec1.mul(invProjectionMatrix);
    tmpVec1.z = -100f;
    tmpVec1.w = 0.0f;
    tmpVec1.mul(invViewMatrix);
    Vector3f mouseDir2 = new Vector3f();
    mouseDir2.set(tmpVec1.x, tmpVec1.y, tmpVec1.z);
    System.out.println();
    return mouseDir1;
}
Output ray:
The matrix library you seem to be using, JOML, provides this operation (which is usually called "unprojecting") via the following code (adapted to your code):
Vector3f worldCoords = new Matrix4f(pMatrix).mul(view).unproject(x, y, z,
new int[] { 0, 0, wdwWidth, wdwHeight }, new Vector3f());
JavaDoc: https://joml-ci.github.io/JOML/apidocs/org/joml/Matrix4f.html#unproject-float-float-float-int:A-org.joml.Vector3f-
In general you should use Matrix4f.unproject, as @httpdigest demonstrated in his answer.
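If you want the full picking ray rather than a single unprojected point, the same unproject call can be used twice, once at the near plane (winZ = 0) and once at the far plane (winZ = 1). The sketch below is a suggestion, not code from the question; the helper class name PickingRay is mine, and note that JOML's unproject expects window coordinates with a bottom-left origin, so a top-left mouse y has to be flipped (winY = height - mouseY) before calling it.

```java
import org.joml.Matrix4f;
import org.joml.Vector3f;

public class PickingRay {
    // Build a picking ray by unprojecting the cursor at the near (winZ = 0)
    // and far (winZ = 1) plane and taking the normalized difference.
    static void ray(Matrix4f pMatrix, Matrix4f view,
                    float winX, float winY, int width, int height,
                    Vector3f originOut, Vector3f dirOut) {
        int[] viewport = { 0, 0, width, height };
        Matrix4f pv = new Matrix4f(pMatrix).mul(view);
        Vector3f near = pv.unproject(winX, winY, 0f, viewport, new Vector3f());
        Vector3f far  = pv.unproject(winX, winY, 1f, viewport, new Vector3f());
        originOut.set(near);
        far.sub(near, dirOut).normalize();
    }

    public static void main(String[] args) {
        // Sanity check with identity matrices: the center of a 640x640
        // viewport unprojects to the NDC cube's center axis.
        Vector3f origin = new Vector3f(), dir = new Vector3f();
        ray(new Matrix4f(), new Matrix4f(), 320f, 320f, 640, 640, origin, dir);
        System.out.println("origin: " + origin + ", dir: " + dir);
    }
}
```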
So I want to focus on the background:
Why are the coordinates x, y the same?
The coordinates are the same because the x and y coordinates of the source are the same, and you do no perspective divide after the multiplication by the inverse projection matrix. The mul operation performs no perspective divide; it transforms a Vector4f by a Matrix4f, and the result is of type Vector4f too.
Further, the source coordinates which you multiply by the inverse projection matrix have to be normalized device coordinates, where x, y and z are in the range [-1.0, 1.0]. z = -1.0 is the minimum depth (near plane) and z = 1.0 is the maximum depth (far plane).
Note that the projection of the line of sight (view ray) onto the viewport is, of course, a single point.
When you perform a multiplication by an (inverse) projection matrix, the result is not a Cartesian coordinate but a homogeneous coordinate. You have to perform a perspective divide to convert from homogeneous coordinates to Cartesian coordinates:
// transform x and y mouse coordinate to normalized device space
Vector2d mousePos = mouseInput.getCurrentPos();
float x = (float)(2 * mousePos.x) / (float)wdwWitdh - 1.0f;
float y = 1.0f - (float)(2 * mousePos.y) / (float)wdwHeight;
....
// normalized device coordinate to view space coordinate (near plane)
Vector4f tmpVec = new Vector4f();
tmpVec.set(x, y, -1.0f, 1.0f);
tmpVec.mul(invProjectionMatrix);
// perspective divide
tmpVec.x = tmpVec.x / tmpVec.w;
tmpVec.y = tmpVec.y / tmpVec.w;
tmpVec.z = tmpVec.z / tmpVec.w;
tmpVec.w = 1.0f;
// normalized device coordinate to view space coordinate (far plane)
Vector4f tmpVec1 = new Vector4f();
tmpVec1.set(x, y, 1.0f, 1.0f);
tmpVec1.mul(invProjectionMatrix);
// perspective divide
tmpVec1.x = tmpVec1.x / tmpVec1.w;
tmpVec1.y = tmpVec1.y / tmpVec1.w;
tmpVec1.z = tmpVec1.z / tmpVec1.w;
tmpVec1.w = 1.0f;
The projection matrix describes the mapping from 3D points of a scene to 2D points of the viewport. The projection matrix transforms from view space to clip space. The coordinates in clip space are transformed to normalized device coordinates (NDC) in the range (-1, -1, -1) to (1, 1, 1) by dividing by the w component of the clip coordinates.
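The forward direction of this mapping can be sketched in plain Java without any matrix library (the field of view, aspect ratio and plane distances below are assumed values, not from the question): a view-space point is taken to clip space by the standard perspective matrix, then divided by w. Two points on the same line of sight land on the same NDC x, y.

```java
public class ProjectDemo {
    // Project a view-space point to NDC for a symmetric perspective
    // projection (OpenGL convention, camera looking down -z).
    static float[] project(float x, float y, float z,
                           float fovY, float aspect, float zn, float zf) {
        float f = (float) (1.0 / Math.tan(fovY / 2.0));
        // clip = P * (x, y, z, 1)
        float cx = (f / aspect) * x;
        float cy = f * y;
        float cz = (zf + zn) / (zn - zf) * z + 2f * zf * zn / (zn - zf);
        float cw = -z;                        // w component of clip coords
        // perspective divide: clip space -> normalized device coordinates
        return new float[] { cx / cw, cy / cw, cz / cw };
    }

    public static void main(String[] args) {
        float fov = (float) Math.toRadians(60);
        // Two points on the same line of sight (x/|z| is constant) ...
        float[] a = project(-1f, 0f, -10f, fov, 1f, 0.1f, 100f);
        float[] b = project(-5f, 0f, -50f, fov, 1f, 0.1f, 100f);
        // ... end up at the same NDC x and y; only the depth differs.
        System.out.printf("ndc x: %.5f vs %.5f%n", a[0], b[0]);
        System.out.printf("ndc z: %.5f vs %.5f%n", a[2], b[2]);
    }
}
```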
At perspective projection, the projection matrix describes the mapping from 3D points in the world as they are seen from a pinhole camera to 2D points of the viewport.
The eye space coordinates in the camera frustum (a truncated pyramid) are mapped to a cube (the normalized device coordinates).
Because of that, the x and y coordinates on the viewport depend on the depth (the view space z coordinate).
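This depth dependence is exactly what the missing perspective divide hides. Here is a minimal, dependency-free sketch (field of view, aspect and plane distances are assumed values): it unprojects the same NDC x, y once at the near plane and once at the far plane, using the analytic inverse of the perspective matrix, and the resulting view-space x, y differ by the ratio of the plane distances.

```java
public class UnprojectDemo {
    // Unproject an NDC point (nx, ny, nz in [-1, 1]) back to view space
    // for a symmetric perspective projection (OpenGL convention).
    static float[] unproject(float nx, float ny, float nz,
                             float fovY, float aspect, float zn, float zf) {
        float f = (float) (1.0 / Math.tan(fovY / 2.0));
        float a = (zf + zn) / (zn - zf);      // P[2][2]
        float b = 2f * zf * zn / (zn - zf);   // P[2][3]
        float zView = -b / (a + nz);          // solve nz = (a*z + b) / (-z)
        float wClip = -zView;                 // the perspective divide factor
        float xView = nx * wClip * aspect / f;
        float yView = ny * wClip / f;
        return new float[] { xView, yView, zView };
    }

    public static void main(String[] args) {
        float fov = (float) Math.toRadians(60);
        float[] near = unproject(-0.5f, 0.25f, -1f, fov, 1f, 0.1f, 100f);
        float[] far  = unproject(-0.5f, 0.25f,  1f, fov, 1f, 0.1f, 100f);
        // Same screen position, but very different view-space x and y
        // once the divide by w is accounted for.
        System.out.printf("near: %.4f %.4f %.4f%n", near[0], near[1], near[2]);
        System.out.printf("far:  %.4f %.4f %.4f%n", far[0], far[1], far[2]);
    }
}
```

With zn = 0.1 and zf = 100, the far point's x and y are 1000 times those of the near point, which is why keeping them equal (as in the question's code) cannot produce a correct ray.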