I would like to write a function that takes a 3d triangle (as 3 points (vector3ds)) and returns a 2d triangle (as 3 points (vector2ds)):
When given a 3d triangle, it should return the 2 dimensional coordinates of its points as they lie on its plane. (By 'its plane' I mean the plane that all three points lie on).
I can think of a long-winded way to do this:
- rotate the triangle until its normal is equal to +z (0,0,1), then construct a triangle from the (x, y) coords of each point.
I can't help but think there must be an easier way to achieve the same thing.
If posting code examples, please try not to use the Greek alphabet. Some pseudocode in a C/Java-style language would be ideal.
There is no singular answer to this problem.
The plane in which the triangle sits has no defined origin, and no defined orientation.
The best you can do is define one of the vertices as the origin, and one of the edges as lying along the X axis:
You will need to calculate vectors A (i.e. v2 - v1) and B (i.e. v3 - v1).

With vertex 1 at the origin, vertex 2 will then be at:

(|A|, 0)

The position of vertex 3 can be worked out using the vector cross product rule:

|A x B| = |A| |B| sin(theta)

So, work out A x B, and from that you can work out the sine of the angle theta between A and B:

sin(theta) = |A x B| / (|A| |B|)

Vertex 3 is then at coordinates:

(|B| cos(theta), |B| sin(theta))

You can take advantage of cos(theta) = sqrt(1 - sin(theta)^2) to avoid any inverse trig operations. You should also see that |B| sin(theta) is just |A x B| / |A|.
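As a rough sketch of the above in Java-style pseudocode (the Vec3 helper and the flattenTriangle name are illustrative, not an existing API): it takes x3 from the dot product A . B = |A| |B| cos(theta) rather than from the sqrt identity, so the sign stays correct when the angle at vertex 1 is obtuse.

```java
// Sketch only: Vec3 stands in for whatever vector3d type you already have.
class Vec3 {
    double x, y, z;
    Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    Vec3 sub(Vec3 o)   { return new Vec3(x - o.x, y - o.y, z - o.z); }
    Vec3 cross(Vec3 o) { return new Vec3(y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x); }
    double dot(Vec3 o) { return x * o.x + y * o.y + z * o.z; }
    double length()    { return Math.sqrt(dot(this)); }
}

class TriangleFlattener {
    // Returns the three vertices as {x, y} pairs in the plane of the triangle,
    // with v1 at the origin and the edge v1 -> v2 along the +X axis.
    static double[][] flattenTriangle(Vec3 v1, Vec3 v2, Vec3 v3) {
        Vec3 A = v2.sub(v1);
        Vec3 B = v3.sub(v1);

        double lenA = A.length();
        double y3   = A.cross(B).length() / lenA;   // |A x B| / |A| = |B| sin(theta)
        double x3   = A.dot(B) / lenA;              // |B| cos(theta), with the sign kept

        return new double[][] {
            { 0.0,  0.0 },    // vertex 1: the chosen origin
            { lenA, 0.0 },    // vertex 2: on the X axis
            { x3,   y3  }     // vertex 3
        };
    }
}
```

Distances between the returned 2D points match the original 3D edge lengths, which is usually the property you want when unwrapping a triangle onto its own plane.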
From your comments I infer that you can choose the coordinate system of the plane in an arbitrary way, as long as the Euclidean metric of this coordinate system is the same as the metric induced by the Euclidean metric of your three-dimensional coordinate system. (That is, Euclidean distances will stay the same.)
One possible solution:
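One common way to build such a distance-preserving mapping is to form an orthonormal basis (e1, e2) inside the triangle's plane via Gram-Schmidt and read off each vertex's 2D coordinates as dot products with that basis. A minimal sketch of that idea (plain double[3] arrays and the toPlaneCoords name are illustrative, not code from this answer):

```java
// Sketch only: points are plain double[3] arrays; names are illustrative.
class PlaneCoords {
    // Returns the three vertices as {u, v} pairs in the triangle's own plane.
    // Assumes the triangle is not degenerate (the three points are not collinear).
    static double[][] toPlaneCoords(double[] p1, double[] p2, double[] p3) {
        double[] a = sub(p2, p1);                   // first edge
        double[] b = sub(p3, p1);                   // second edge

        double[] e1 = normalize(a);                 // X axis of the plane
        // Gram-Schmidt: strip b's component along e1, leaving the in-plane Y direction.
        double[] e2 = normalize(sub(b, scale(e1, dot(b, e1))));

        return new double[][] {
            { 0.0, 0.0 },                           // p1 is the origin
            { dot(a, e1), 0.0 },                    // p2 lies on the X axis
            { dot(b, e1), dot(b, e2) }              // p3
        };
    }

    static double[] sub(double[] u, double[] v) { return new double[] { u[0] - v[0], u[1] - v[1], u[2] - v[2] }; }
    static double[] scale(double[] u, double s) { return new double[] { u[0] * s, u[1] * s, u[2] * s }; }
    static double   dot(double[] u, double[] v) { return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]; }
    static double[] normalize(double[] u)       { return scale(u, 1.0 / Math.sqrt(dot(u, u))); }
}
```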