I'm trying to find the coordinates of the four corners of a rotated rectangular UIView.
I think one way to do it is to use recognizer.rotation to get the rotation angle and then calculate the corners from it, but that requires some geometry:
- (IBAction)handlePan:(UIRotationGestureRecognizer *)recognizer {
    NSLog(@"Rotation in degrees since last change: %f", [recognizer rotation] * (180 / M_PI));
    // Apply the incremental rotation on top of the view's current transform.
    recognizer.view.transform = CGAffineTransformRotate(recognizer.view.transform, recognizer.rotation);
    NSLog(@"%@", recognizer);
    // Reset so the next callback reports only the rotation since this one.
    recognizer.rotation = 0;
    NSLog(@"bounds is %f x %f, frame is %f x %f at (%f, %f).",
          recognizer.view.bounds.size.width, recognizer.view.bounds.size.height,
          recognizer.view.frame.size.width, recognizer.view.frame.size.height,
          recognizer.view.frame.origin.x, recognizer.view.frame.origin.y);
}
I'm just wondering if there is an easier way to get the coordinates. Thanks!
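For example, I was hoping something along these lines would do it -- just a sketch I haven't tried yet, and cornerInSuperview is only a helper name I made up. It assumes the view keeps the default anchorPoint of (0.5, 0.5), so its transform is applied around view.center:

#import <UIKit/UIKit.h>

// Map a point given in the view's bounds coordinates into superview
// coordinates, taking the view's current transform into account.
static CGPoint cornerInSuperview(UIView *view, CGPoint boundsPoint) {
    CGRect b = view.bounds;
    // Offset of the point from the center of the bounds.
    CGPoint offset = CGPointMake(boundsPoint.x - CGRectGetMidX(b),
                                 boundsPoint.y - CGRectGetMidY(b));
    // Run that offset through the view's transform (rotation, scale, ...).
    CGPoint t = CGPointApplyAffineTransform(offset, view.transform);
    // view.center is expressed in superview coordinates, so re-anchor there.
    return CGPointMake(view.center.x + t.x, view.center.y + t.y);
}

The four corners would then be, e.g. inside the gesture handler:

UIView *v = recognizer.view;
CGRect b = v.bounds;
CGPoint topLeft     = cornerInSuperview(v, CGPointMake(CGRectGetMinX(b), CGRectGetMinY(b)));
CGPoint topRight    = cornerInSuperview(v, CGPointMake(CGRectGetMaxX(b), CGRectGetMinY(b)));
CGPoint bottomLeft  = cornerInSuperview(v, CGPointMake(CGRectGetMinX(b), CGRectGetMaxY(b)));
CGPoint bottomRight = cornerInSuperview(v, CGPointMake(CGRectGetMaxX(b), CGRectGetMaxY(b)));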
EDIT:
Looks like we have a great answer here (see the answer below). I managed to calculate the corners the hard way -- using the rotation angle and some trigonometry. It works, but it isn't easy or lightweight. I'm sharing my code here in case someone wants to use it (even though I doubt anyone will).
// r is the distance from the view's center to each corner (half the diagonal),
// hard-coded here for my view; M_PI/3 is the corner's base angle for that size.
float r = 100;
NSLog(@"radius is %f.", r);
// Corner A (top-left when rotatedAngle is 0): base angle plus the accumulated rotation.
float AAngle = M_PI / 3 + self.rotatedAngle;
float AY = recognizer.view.center.y - sin(AAngle) * r;
float AX = recognizer.view.center.x - cos(AAngle) * r;
self.pointPADA = CGPointMake(AX, AY);
NSLog(@"View center is (%f, %f)", recognizer.view.center.x, recognizer.view.center.y);
NSLog(@"Point A has coordinates (%f, %f)", self.pointPADA.x, self.pointPADA.y);
// Corner B (top-right): base angle minus the accumulated rotation.
float BAngle = M_PI / 3 - self.rotatedAngle;
float BY = recognizer.view.center.y - sin(BAngle) * r;
float BX = recognizer.view.center.x + cos(BAngle) * r;
self.pointPADB = CGPointMake(BX, BY);
NSLog(@"Point B has coordinates (%f, %f)", BX, BY);
// Corner C is diagonally opposite A, so it mirrors A through the center.
float CY = recognizer.view.center.y + sin(AAngle) * r;
float CX = recognizer.view.center.x + cos(AAngle) * r;
self.pointPADC = CGPointMake(CX, CY);
NSLog(@"Point C has coordinates (%f, %f)", CX, CY);
// Corner D is diagonally opposite B.
float DY = recognizer.view.center.y + sin(BAngle) * r;
float DX = recognizer.view.center.x - cos(BAngle) * r;
self.pointPADD = CGPointMake(DX, DY);
NSLog(@"Point D has coordinates (%f, %f)", DX, DY);
That's my solution, though I wonder if there's a more succinct way.
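One more compact possibility, though I haven't verified it myself, is to let UIKit do the conversion: as far as I know, -convertPoint:toView: goes through the layer hierarchy and takes the view's transform into account, so the bounds corners can be mapped straight into superview coordinates:

UIView *v = recognizer.view;
CGRect b = v.bounds;
// Each bounds corner converted into the superview's coordinate system.
CGPoint topLeft     = [v convertPoint:CGPointMake(CGRectGetMinX(b), CGRectGetMinY(b)) toView:v.superview];
CGPoint topRight    = [v convertPoint:CGPointMake(CGRectGetMaxX(b), CGRectGetMinY(b)) toView:v.superview];
CGPoint bottomLeft  = [v convertPoint:CGPointMake(CGRectGetMinX(b), CGRectGetMaxY(b)) toView:v.superview];
CGPoint bottomRight = [v convertPoint:CGPointMake(CGRectGetMaxX(b), CGRectGetMaxY(b)) toView:v.superview];
NSLog(@"Corners: (%f, %f) (%f, %f) (%f, %f) (%f, %f)",
      topLeft.x, topLeft.y, topRight.x, topRight.y,
      bottomLeft.x, bottomLeft.y, bottomRight.x, bottomRight.y);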
Checked answer in Swift 3.1