This is probably a very stupid math question, but I can't seem to figure it out. I have a circle at point A that I can click on and drag the mouse away from. When the mouse is released, the release point B is treated as a target point and the ball has to move in that direction. What I'm doing now is something like this:
velocityX = (b.x - a.x) / somenumber
velocityY = (b.y - a.y) / somenumber
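To be concrete, here's roughly what my current code does (simplified; `somenumber` is just an arbitrary damping constant I picked by trial and error, and the names are mine):

```javascript
// Current approach: velocity is proportional to the drag distance,
// scaled down by an arbitrary constant.
const SOMENUMBER = 10;

function launchVelocity(a, b) {
  return {
    vx: (b.x - a.x) / SOMENUMBER,
    vy: (b.y - a.y) / SOMENUMBER,
  };
}

// Dragging farther produces a faster shot:
// launchVelocity({x: 0, y: 0}, {x: 30, y: 40}) gives { vx: 3, vy: 4 }
```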
This gives me different "shot" speeds depending on how far from the circle the mouse is released. But now I've realised I don't like this idea, and instead I want to do it the following way:
- to have a minimum and maximum speed (pixels per animation frame)
- to select the speed from this interval prior to the shot
- to use point B only for easier aiming. The shot speed is preselected and shouldn't depend on how far away the mouse is released
I know it should be dead simple, but how do I set the circle's x and y velocities (knowing the coordinates of points A and B, plus the min, max, and selected velocity) so that it moves in the direction of the shot?
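In case it helps, here's my best guess at what the solution might look like (JavaScript, all names are mine): normalize the A-to-B direction to unit length, then scale it by the preselected speed. But I'm not sure this is the right approach:

```javascript
// Guess: divide the direction vector by its length to get a unit
// vector, then multiply by the chosen speed (pixels per frame).
function shotVelocity(a, b, speed) {
  const dx = b.x - a.x;
  const dy = b.y - a.y;
  const dist = Math.hypot(dx, dy); // length of the A->B vector
  if (dist === 0) {
    return { vx: 0, vy: 0 }; // A and B coincide: no direction to shoot in
  }
  return {
    vx: (dx / dist) * speed,
    vy: (dy / dist) * speed,
  };
}

// shotVelocity({x: 0, y: 0}, {x: 30, y: 40}, 10) gives { vx: 6, vy: 8 }
```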