I've been bashing my face into this one for literally days now, and even though I constantly feel that I am right on the edge of revelation, I simply cannot achieve my goal.
I thought, ahead of time in the conceptual phase of my design, that it would be a trivial matter to grab an image from the iPhone's camera or library, scale it down to a specified height using the equivalent of UIImageView's Aspect Fill option (entirely in code), and then crop off anything that did not fit within a passed CGRect.
Getting the original image from the camera or library was trivial. I am shocked at how difficult the other two steps have proved to be.
The attached image shows what I am trying to achieve. Would someone please be kind enough to hold my hand? Every code example I have found so far seems to smash the image, be upside down, look like crap, draw out of bounds, or otherwise just not work correctly.
Below is an example of the result: the left image is the original; the right image is scaled 2x.
If you want to scale an image but retain its frame (proportions), call the method this way:
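The method itself did not survive in this excerpt; here is a minimal sketch of what such a helper and call might look like (the name scaled(_:by:) is my own, not the answer's):

```swift
import UIKit

// A minimal sketch, assuming a helper along these lines; the answer's
// own method was not included here.
func scaled(_ image: UIImage, by factor: CGFloat) -> UIImage? {
    // Multiplying both dimensions by the same factor keeps the proportions.
    let newSize = CGSize(width: image.size.width * factor,
                         height: image.size.height * factor)
    // Passing the image's own scale preserves its pixel density.
    UIGraphicsBeginImageContextWithOptions(newSize, false, image.scale)
    defer { UIGraphicsEndImageContext() }
    image.draw(in: CGRect(origin: .zero, size: newSize))
    return UIGraphicsGetImageFromCurrentImageContext()
}

// Called this way for the 2x example above:
// let doubled = scaled(originalImage, by: 2.0)
```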
This question seems to have been put to rest, but in my quest for a solution that I could more easily understand (and that was written in Swift), I arrived at the following (also posted to: How to crop the UIImage?).
I wanted to be able to crop from a region based on an aspect ratio, and scale to a size based on an outer bounding extent. Here is my variation:
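The variation's code was not reproduced in this excerpt; the following sketch is reconstructed from the explanation that follows (the extension name croppedAndScaled(from:to:) is mine):

```swift
import UIKit

extension UIImage {
    // `cropRect` is in this image's coordinate space and is assumed to
    // have the same aspect ratio as `outputSize`.
    func croppedAndScaled(from cropRect: CGRect, to outputSize: CGSize) -> UIImage? {
        // The scale factor is output / input, as described below.
        let scale = outputSize.width / cropRect.width

        // Cropping comes from the rect's (negative, scaled) origin;
        // resizing comes from its size.
        let drawRect = CGRect(x: -cropRect.origin.x * scale,
                              y: -cropRect.origin.y * scale,
                              width: size.width * scale,
                              height: size.height * scale)

        UIGraphicsBeginImageContextWithOptions(outputSize, false, 0)
        defer { UIGraphicsEndImageContext() }
        draw(in: drawRect)
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}
```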
There were a couple of things that I found confusing: the separate concerns of cropping and resizing. Cropping is handled by the origin of the rect that you pass to drawInRect, and scaling is handled by its size portion. In my case, I needed to relate the size of the cropping rect on the source to my output rect of the same aspect ratio. The scale factor is then output / input, and this needs to be applied to the drawRect (passed to drawInRect). For example, cropping a 1000×1000 region of the source for a 500×500 output gives a scale factor of 0.5, which scales the entire drawRect down by half.
One caveat is that this approach effectively assumes that the image you are drawing is larger than the image context. I have not tested this, but I think you can use this code to handle cropping / zooming by explicitly setting the context's scale parameter to the aforementioned scale factor. By default, UIKit applies a multiplier based on the screen resolution.
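Concretely, that multiplier is the third argument to UIGraphicsBeginImageContextWithOptions: a value of 0 defers to the device's screen scale, while an explicit value pins the pixel density. A small illustration:

```swift
import UIKit

let outputSize = CGSize(width: 500, height: 500)

// Default behavior: a scale of 0 means "use the screen scale", so the
// backing bitmap is 2x/3x the point size on Retina devices.
UIGraphicsBeginImageContextWithOptions(outputSize, false, 0)
UIGraphicsEndImageContext()

// Explicit: exactly one rendered pixel per point, independent of the
// device; pass your computed scale factor here instead if desired.
UIGraphicsBeginImageContextWithOptions(outputSize, false, 1)
UIGraphicsEndImageContext()
```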
Finally, it should be noted that this UIKit approach is higher level than the Core Graphics / Quartz and Core Image approaches, and it seems to handle image orientation issues. It is also worth mentioning that it is pretty fast, second only to ImageIO, according to this post: http://nshipster.com/image-resizing/
I converted Sam Wirch's guide to Swift and it worked well for me, although there's some very slight "squishing" in the final image that I couldn't resolve.
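The converted code itself was not included in this excerpt; below is a sketch in the same spirit (my reconstruction, not the answer's code), assuming the square crop that Wirch's guide describes: aspect-fill the source into a square, then crop the overflow evenly.

```swift
import UIKit

// A reconstruction under the assumptions above; all names are mine.
func squareCropImage(_ sourceImage: UIImage, toSideLength sideLength: CGFloat) -> UIImage? {
    // Scale so the shorter side exactly fills the square.
    let shortSide = min(sourceImage.size.width, sourceImage.size.height)
    let scale = sideLength / shortSide
    let scaledSize = CGSize(width: sourceImage.size.width * scale,
                            height: sourceImage.size.height * scale)

    // Center the oversized draw rect; the longer dimension hangs off both
    // edges. Non-integral rects here are one possible source of the slight
    // "squishing" mentioned above (untested speculation).
    let drawRect = CGRect(x: (sideLength - scaledSize.width) / 2,
                          y: (sideLength - scaledSize.height) / 2,
                          width: scaledSize.width,
                          height: scaledSize.height)

    UIGraphicsBeginImageContextWithOptions(CGSize(width: sideLength, height: sideLength), false, 0)
    defer { UIGraphicsEndImageContext() }
    sourceImage.draw(in: drawRect)
    return UIGraphicsGetImageFromCurrentImageContext()
}
```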
If anyone wants the Objective-C version, it's on his website.
I needed the same thing: in my case, to pick the dimension that fits once scaled, and then crop each end to fit the rest to the width. (I'm working in landscape, so I might not have noticed any deficiencies in portrait mode.) Here's my code; it's part of a category on UIImage. The target size in my code is always set to the full screen size of the device.
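The category's code did not survive in this excerpt either; here is a minimal sketch of the same idea, written as a Swift extension (all names are mine):

```swift
import UIKit

extension UIImage {
    // Scale so that one dimension exactly fits `targetSize` (Aspect Fill),
    // then crop the overflowing dimension equally from each end.
    func scaledAndCropped(to targetSize: CGSize) -> UIImage? {
        // The larger ratio makes one dimension fit and the other overflow.
        let scale = max(targetSize.width / size.width,
                        targetSize.height / size.height)
        let scaledSize = CGSize(width: size.width * scale,
                                height: size.height * scale)

        // Centering the draw rect crops both ends evenly.
        let drawRect = CGRect(x: (targetSize.width - scaledSize.width) / 2,
                              y: (targetSize.height - scaledSize.height) / 2,
                              width: scaledSize.width,
                              height: scaledSize.height)

        UIGraphicsBeginImageContextWithOptions(targetSize, false, 0)
        defer { UIGraphicsEndImageContext() }
        draw(in: drawRect)
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}

// As in the answer above, the target is the device's full screen size:
// let result = photo.scaledAndCropped(to: UIScreen.main.bounds.size)
```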