This is probably a very stupid math question, but I can't seem to figure it out. What I have is a circle at point A that I can click on and drag the mouse away from. When the mouse is released, the release point B is treated as a target point and the ball has to move in that direction. What I'm doing now is something like this:
velocityX = (b.x - a.x) / somenumber
velocityY = (b.y - a.y) / somenumber
This gives me different "shot" speeds depending on how far from the circle the mouse is released. But now I've realised that I don't like this idea, and instead I want to do it the following way:
- to have a minimum and maximum speed (pixels per animation frame)
- to select the speed from this interval prior to the shot
- to use the point B simply for easier targeting. The shot speed is preselected and it shouldn't depend on how far away the mouse is released
I know it should be dead simple, but how do I (knowing the coordinates of points A and B, the min, max and selected velocity) set the circle's x and y velocities, taking into account the direction of the shot?
Just normalize the vector from the center of the circle to the point and then multiply by the speed you want. Any good vector library has such a function, but just to clarify:
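For instance, a minimal sketch in TypeScript (the Point shape and the function name are just for illustration):

interface Point { x: number; y: number; }

// Velocity for a shot from a towards b at the chosen speed:
// normalize the direction vector, then scale it by the speed.
function shotVelocity(a: Point, b: Point, speed: number): Point {
    const dx = b.x - a.x;
    const dy = b.y - a.y;
    const len = Math.hypot(dx, dy);        // distance from a to b
    if (len === 0) return { x: 0, y: 0 };  // a and b coincide: no direction
    return { x: (dx / len) * speed, y: (dy / len) * speed };
}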
I can think of two ways to do it.
Let's say the angle from a to b is T. Then:
T = atan2(b.y - a.y, b.x - a.x)
(Use atan2 rather than plain atan of the slope, so the angle lands in the correct quadrant when b is to the left of or below a.)
Knowing T you can calculate the x and y velocities:
Vx = V * cos(T)
Vy = V * sin(T)
That should work.
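As a sketch of this first way in TypeScript (the names are illustrative):

interface Point { x: number; y: number; }

function velocityViaAngle(a: Point, b: Point, speed: number): Point {
    const t = Math.atan2(b.y - a.y, b.x - a.x); // shot angle, correct in all quadrants
    return { x: speed * Math.cos(t), y: speed * Math.sin(t) };
}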
To make things quicker you could calculate cos(T) and sin(T) directly.
sin(T) gives the proportion (b.y - a.y)/h, where h is the length of the line between a and b; likewise cos(T) gives (b.x - a.x)/h.
We can calculate h using the Pythagorean theorem:
h = sqrt((b.y-a.y)^2 + (b.x-a.x)^2)
From this we can derive formulas for Vx and Vy:
Vx = V * (b.x - a.x) / sqrt((b.y - a.y)^2 + (b.x - a.x)^2)
Vy = V * (b.y - a.y) / sqrt((b.y - a.y)^2 + (b.x - a.x)^2)
That will probably be faster, particularly if your language has a built-in hypotenuse function (such as hypot).
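A sketch of this second way in TypeScript, using the built-in Math.hypot for the square root (the clamp to the question's min/max interval and all names are illustrative):

interface Point { x: number; y: number; }

function velocityDirect(a: Point, b: Point, chosenSpeed: number,
                        minSpeed: number, maxSpeed: number): Point {
    const v = Math.min(maxSpeed, Math.max(minSpeed, chosenSpeed)); // keep speed in range
    const dx = b.x - a.x;
    const dy = b.y - a.y;
    const h = Math.hypot(dx, dy);           // h = sqrt(dx^2 + dy^2)
    if (h === 0) return { x: 0, y: 0 };     // no direction if a and b coincide
    return { x: v * dx / h, y: v * dy / h };
}

For example, with a = (0, 0), b = (3, 4) and V = 10: h = 5, so Vx = 10 * 3/5 = 6 and Vy = 10 * 4/5 = 8.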