How does the "move and scale screen" determine dimensions for its cropbox?
Basically I would like to set a fixed width and height for the "CropRect" and let the user move and scale his image to fit into that box as desired.
Does anyone know how to do this? (Or if it is even possible with the UIImagePickerController)
Thanks!
Not possible with UIImagePickerController, unfortunately. The solution I recommend is to disable editing for the image picker and handle it yourself. For instance, I put the image in a scrollable, zoomable image view. On top of the image view is a fixed-position "crop guide view" that draws the crop indicator the user sees. Assuming the guide view has properties for the visible rect (the part to keep) and edge widths (the part to discard), you can get the cropping rectangle like so (the UIImage+Resize category does the actual cropping):
// Rect of the crop guide in screen coordinates (the part to keep)
CGRect cropGuide = self.cropGuideView.visibleRect;
// Margins around the guide (the part to discard)
UIEdgeInsets edges = self.cropGuideView.edgeWidths;

// How far the user has scrolled the photo under the guide
CGPoint cropGuideOffset = self.cropScrollView.contentOffset;

// Crop rect in the scroll view's (zoomed) coordinate space
CGPoint origin = CGPointMake( cropGuideOffset.x + edges.left, cropGuideOffset.y + edges.top );
CGSize size = cropGuide.size;
CGRect crop = { origin, size };

// Convert from zoomed coordinates back to the image's own coordinates
crop.origin.x = crop.origin.x / self.cropScrollView.zoomScale;
crop.origin.y = crop.origin.y / self.cropScrollView.zoomScale;
crop.size.width = crop.size.width / self.cropScrollView.zoomScale;
crop.size.height = crop.size.height / self.cropScrollView.zoomScale;

// Do the actual crop with the UIImage+Resize category
photo = [photo croppedImage:crop];
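If it helps, here is a minimal Swift sketch of the setup described above: the photo sits in a zoomable UIScrollView and a fixed, touch-transparent crop guide view is layered on top. The CropGuideView class and its visibleRect/edgeWidths properties are hypothetical stand-ins for whatever overlay you draw; only the UIKit calls are real API, and the zoom limits are arbitrary.

import UIKit

// Hypothetical overlay that draws the crop indicator and exposes the
// rect to keep (visibleRect) and the margins to discard (edgeWidths).
final class CropGuideView: UIView {
    var visibleRect: CGRect = .zero
    var edgeWidths: UIEdgeInsets = .zero
}

final class ManualCropViewController: UIViewController, UIScrollViewDelegate {
    let cropScrollView = UIScrollView()
    let imageView = UIImageView()
    let cropGuideView = CropGuideView()

    var photo: UIImage? {
        didSet {
            imageView.image = photo
            imageView.sizeToFit()
            cropScrollView.contentSize = imageView.bounds.size
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()

        // Zoomable scroll view that holds the photo.
        cropScrollView.frame = view.bounds
        cropScrollView.delegate = self
        cropScrollView.minimumZoomScale = 1.0
        cropScrollView.maximumZoomScale = 4.0
        cropScrollView.addSubview(imageView)
        view.addSubview(cropScrollView)

        // Fixed crop guide on top; it ignores touches so pan/pinch
        // gestures still reach the scroll view underneath.
        cropGuideView.frame = view.bounds
        cropGuideView.isUserInteractionEnabled = false
        view.addSubview(cropGuideView)
    }

    // Tell the scroll view which subview to zoom.
    func viewForZooming(in scrollView: UIScrollView) -> UIView? {
        return imageView
    }
}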
Kinda late to the game but I think this may be what you are looking for: https://github.com/gekitz/GKImagePicker
Here is a solution for manual cropping by Ming Yang.
https://github.com/myang-git/iOS-Image-Crop-View
It offers a rectangular frame that the user can slide or drag to fit the required portion of the image inside the rectangle. Note that this solution does the reverse of what the question asks (it lets the rectangle's size vary rather than fixing it), but it ultimately produces the desired result.
It is written in Objective-C, so if your project is in Swift you may have to either port the code or add a bridging header to expose the Objective-C code to Swift.
It's now later than late but may be useful for someone. This is the library I've used for Swift (many thanks to Tim Oliver):
TOCropViewController
As described in the README at the GitHub link above, this library lets you crop images to a user-defined rectangle and also offers a circular mode, e.g. for updating a profile image.
Below is the sample code from the GitHub README:
func presentCropViewController() {
    let image: UIImage = ... // Load an image
    let cropViewController = CropViewController(image: image)
    cropViewController.delegate = self
    present(cropViewController, animated: true, completion: nil)
}

func cropViewController(_ cropViewController: CropViewController, didCropToImage image: UIImage, withRect cropRect: CGRect, angle: Int) {
    // 'image' is the newly cropped version of the original image
}
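To get the fixed crop box from the original question with this library, you can lock the crop rectangle to a preset aspect ratio. This is only a sketch; the property names (customAspectRatio, aspectRatioLockEnabled, resetAspectRatioEnabled, aspectRatioPickerButtonHidden) come from the library's documentation, so double-check them against the version you install:

let cropViewController = CropViewController(image: image)
cropViewController.delegate = self

// Lock the crop box to a fixed shape (a square here) so the user can
// only move and scale the photo underneath it.
cropViewController.customAspectRatio = CGSize(width: 1, height: 1)
cropViewController.aspectRatioLockEnabled = true       // prevent free-form resizing
cropViewController.resetAspectRatioEnabled = false     // keep the ratio after a reset
cropViewController.aspectRatioPickerButtonHidden = true

present(cropViewController, animated: true, completion: nil)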