I'm sure this question has been asked many times, however I cannot find an answer to my specific problem!
I'm creating a game using HTML5 and THREE.js, and I have a camera that rotates with the euler order 'YXZ'.
This is so the camera rotates up, down, left and right like a head, as in a first-person view.
With this euler order, the z property of camera.rotation is not used (always 0). x specifies the pitch (latitude) in radians, and y specifies the yaw (longitude).
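To make that concrete, with the 'YXZ' order the look direction follows from the two angles alone. Here is a minimal plain-JavaScript sketch (not using THREE objects), assuming x is pitch, y is yaw, and THREE's convention that a camera with zero rotation looks down the negative z axis:

```javascript
// Unit look direction for a camera with euler order 'YXZ'.
// pitch = camera.rotation.x, yaw = camera.rotation.y.
// At zero rotation a THREE camera faces (0, 0, -1).
function lookDirection(pitch, yaw) {
  return {
    x: -Math.cos(pitch) * Math.sin(yaw),
    y:  Math.sin(pitch),
    z: -Math.cos(pitch) * Math.cos(yaw)
  };
}
```

For example, `lookDirection(0, 0)` gives (0, 0, -1), and a yaw of -PI/2 (turned to the right) gives (1, 0, 0).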
I have a number of targets moving around the user in a spherical manner, with the camera constantly at the center of this sphere. Let's say the sphere has a radius of 1000.
My aim is to calculate the angle between where the camera is LOOKING and where the target is.
I wrote this code that calculates the angle between two vectors:
var A = new THREE.Vector3(1,0,0);
var B = new THREE.Vector3(0,1,0);
var theta = Math.acos( A.dot(B) / A.length() / B.length() );
// theta = PI/2;
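As an aside, THREE.Vector3 also provides A.angleTo(B), which performs this same computation. Either way, clamping the acos argument is a good idea, since floating-point drift can push the ratio slightly outside [-1, 1] and make acos return NaN. A self-contained plain-JavaScript equivalent:

```javascript
// Angle in radians between two 3D vectors (plain-object equivalent of A.angleTo(B)).
function angleBetween(a, b) {
  var dot = a.x * b.x + a.y * b.y + a.z * b.z;
  var lenProduct = Math.sqrt((a.x * a.x + a.y * a.y + a.z * a.z) *
                             (b.x * b.x + b.y * b.y + b.z * b.z));
  // Clamp so floating-point drift can never push acos out of its domain.
  return Math.acos(Math.min(1, Math.max(-1, dot / lenProduct)));
}
```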
I have the vector of the target (targets[0].position).
The thing is I cannot figure out a way to get the vector of where the camera is looking.
I've looked at the Vector3.applyProjection and applyEuler methods and played around with them, but still cannot get any good results.
I believe that the euler order must first be changed back to 'XYZ' before any projections are made? I've tried, but I'm not sure how to do this; the documentation is pretty hard to understand.
Say the camera is looking directly right; then I need to get a vector that is RADIUS away from the center in the direction of the current rotation.
So I will end up with two vectors, which I can do calculations on!
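Putting the two pieces together, this is roughly the calculation I'm after, sketched in plain JavaScript (assuming the camera sits at the origin of the sphere, x is pitch, and y is yaw; `angleToTarget` is just an illustrative name):

```javascript
// Angle in radians between where a 'YXZ' camera is looking and a target
// position, with the camera at the origin of the target sphere.
// pitch = camera.rotation.x, yaw = camera.rotation.y.
function angleToTarget(pitch, yaw, target) {
  // Unit vector the camera is facing (THREE cameras look down -z at rest).
  var look = {
    x: -Math.cos(pitch) * Math.sin(yaw),
    y:  Math.sin(pitch),
    z: -Math.cos(pitch) * Math.cos(yaw)
  };
  var len = Math.sqrt(target.x * target.x +
                      target.y * target.y +
                      target.z * target.z);
  var cos = (look.x * target.x + look.y * target.y + look.z * target.z) / len;
  // Clamp against floating-point drift before acos.
  return Math.acos(Math.min(1, Math.max(-1, cos)));
}
```

Because the look vector has unit length, the target does not need to be normalized; only the cosine ratio matters, so the sphere radius (1000 here) drops out of the result.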
Thank you in advance!