I am looking to downscale a UIImage in iOS.
I have seen the other questions below and their approaches to downscaling an image to a given size.
Resizing Images Objective-C
How to resize the image programmatically in objective-c in iphone
The simplest way to resize an UIImage?
These questions are all based on re-sizing the image to a specific size. In my case I am looking to re-size/downscale the image based on a maximum size.
As an example, I would like to set a maximum NSData size of 500 KB. I know that I can get the size of the image data like this:

// Check the size of the image data returned
NSData *imageData = UIImageJPEGRepresentation(image, 0.5);

// Log out the image size
NSLog(@"%lu KB", (unsigned long)(imageData.length / 1024));
What I would like to do is some form of loop here. If the size is greater than the maximum size that I set, I would like to scale down the image slightly and then check the size. If it is still too large, scale down slightly again and check again, until it is below the maximum.

I am not sure what the best approach for this is. Ideally I do not want to scale the image down to one specific size every time, but only scale it down slightly on each pass. That way I keep the largest possible image dimensions (width/height) while still staying under the maximum data size (bytes). If I only scale down slightly each time, what would be the best way to accomplish this?
EDIT

To confirm, I am looking to re-size the actual image, but re-size it so that its NSData length is smaller than the maximum. For example:

- Check the NSData length
- If above the maximum, pass the UIImage into a method
- Loop through this method, slightly re-sizing the actual image each time
- Until it is under the maximum NSData length, then return the image (something like the sketch below)?
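Roughly, I am imagining something like this (just a sketch; imageWithImage:scaledBy: stands in for whatever resizing method I end up writing, and 0.8 is an arbitrary per-pass factor):

NSUInteger maxLength = 500 * 1024;   // 500 KB maximum
UIImage *currentImage = image;
NSData *imageData = UIImageJPEGRepresentation(currentImage, 0.5);

if (imageData.length > maxLength) {
    while (imageData.length > maxLength) {
        // Shrink the most recently resized image a little more on each pass.
        currentImage = [self imageWithImage:currentImage scaledBy:0.8];
        imageData = UIImageJPEGRepresentation(currentImage, 0.5);
    }
}
// currentImage now produces JPEG data under the maximum length.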
It maintains image quality and keeps the result no larger than 1 MB. Call it with the image you want to compress.
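For illustration, a helper of that sort and its call might look something like this (the compressImage: name, the bounding box, and the 0.5 quality are assumptions, not the original code):

// Illustrative sketch: scale the image down to fit a maximum bounding box,
// then compress the result as JPEG at a fixed quality.
- (NSData *)compressImage:(UIImage *)image {
    CGFloat maxWidth = 1024.0;    // assumed limits
    CGFloat maxHeight = 1024.0;
    CGFloat scale = MIN(1.0, MIN(maxWidth / image.size.width, maxHeight / image.size.height));
    CGSize newSize = CGSizeMake(image.size.width * scale, image.size.height * scale);

    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return UIImageJPEGRepresentation(scaledImage, 0.5);
}

// Call it with:
NSData *compressedData = [self compressImage:originalImage];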
Besides the maximum size, you also need to choose a minimum size, as well as decide on performance. For example, you could check the size of UIImageJPEGRepresentation(image, 1.0). If too big, do you then check at 0.95 or 0.1?

One possible approach is to get the size of UIImageJPEGRepresentation(image, 1.0) and see by what percent it is too big. For example, say it is 600 kB. You should then compute 500.0 / 600, which is roughly 0.83. So then do UIImageJPEGRepresentation(image, 0.83). That won't give exactly 500 kB, but it may be close enough.
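A sketch of that proportional approach, assuming a 500 KB target:

// Estimate the quality factor from the ratio of the target size to the
// full-quality size, then re-encode once with that quality.
NSUInteger maxLength = 500 * 1024;
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);

if (imageData.length > maxLength) {
    CGFloat quality = (CGFloat)maxLength / (CGFloat)imageData.length;   // e.g. 500 kB / 600 kB ≈ 0.83
    imageData = UIImageJPEGRepresentation(image, quality);
}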
Another approach would be to start with UIImageJPEGRepresentation(image, 1.0). If it's too big, then do UIImageJPEGRepresentation(image, 0.5). If that is still too big, go with 0.25, but if too small, go with 0.75. Keep splitting the difference until you get within an acceptable range of your desired size.
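A sketch of that bisection approach, again assuming a 500 KB target and treating anything within about 10% below the target as acceptable:

// Binary search on the JPEG quality factor.
NSUInteger maxLength = 500 * 1024;
NSUInteger acceptableLength = maxLength * 9 / 10;
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);

if (imageData.length > maxLength) {
    CGFloat lowerQuality = 0.0;
    CGFloat upperQuality = 1.0;
    NSData *bestData = nil;

    for (NSInteger attempt = 0; attempt < 6; attempt++) {
        CGFloat quality = (lowerQuality + upperQuality) / 2.0;   // 0.5, then 0.25 or 0.75, and so on
        NSData *candidate = UIImageJPEGRepresentation(image, quality);

        if (candidate.length > maxLength) {
            upperQuality = quality;        // too big: search lower qualities
        } else {
            bestData = candidate;          // small enough: keep it, then search higher qualities
            if (candidate.length >= acceptableLength) {
                break;                     // within the acceptable range of the target
            }
            lowerQuality = quality;
        }
    }
    imageData = bestData;                  // nil if even the lowest quality tried was still too big
}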
Right now, you have a routine that checks the NSData length and, if it is above the maximum, repeatedly resizes the already-resized image until the data fits.

I wouldn't advise recursively resizing the image like that. Every time you resize, you lose some quality (often manifesting itself as a "softening" of the image with loss of detail, with cumulative effects). You always want to go back to the original image and resize that smaller and smaller. (As a minor aside, that if statement is redundant, too, since the while condition already covers it.) I might suggest the following:
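In outline (a sketch; imageWithImage:scaledBy: is a hypothetical helper that redraws the original image at the requested scale, with a typical implementation shown after the loop):

NSUInteger maxLength = 500 * 1024;   // maximum NSData length in bytes
CGFloat adjustment = 1.0 / sqrt(2.0);
CGFloat scale = 1.0;

UIImage *currentImage = image;
NSData *imageData = UIImageJPEGRepresentation(currentImage, 0.5);

while (imageData.length > maxLength) {
    scale *= adjustment;

    // Always resize from the original image, never from the previously resized one.
    currentImage = [self imageWithImage:image scaledBy:scale];
    imageData = UIImageJPEGRepresentation(currentImage, 0.5);
}

A typical implementation of that hypothetical helper:

- (UIImage *)imageWithImage:(UIImage *)sourceImage scaledBy:(CGFloat)scale {
    CGSize newSize = CGSizeMake(sourceImage.size.width * scale, sourceImage.size.height * scale);
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    [sourceImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}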
Note, I'm not touching image, the original image, but rather assigning currentImage by doing a resize from the original image each time, by a decreasing scale each time.

BTW, if you're wondering about my cryptic 1.0 / sqrt(2.0), I was trying to draw a compromise between your iterative 80% factor and my desire to favor resizing by a power of 2 where I can (because a reduction retains more sharpness when done by a power of 2). But use whatever adjustment factor you want.

Finally, if you're doing this on huge images, you might think about using @autoreleasepool blocks. You'll want to profile your app in Allocations in Instruments and see where your high-water mark is, as in the absence of autorelease pools, this may constitute a fairly aggressive use of memory.
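For example, the body of the loop above could be wrapped in its own pool (a sketch, reusing the same variables and hypothetical helper):

while (imageData.length > maxLength) {
    @autoreleasepool {
        // Temporary images and data created during this pass are released at the
        // end of the pass instead of accumulating until the whole loop finishes.
        scale *= adjustment;
        currentImage = [self imageWithImage:image scaledBy:scale];
        imageData = UIImageJPEGRepresentation(currentImage, 0.5);
    }
}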
This was my approach: a resize image method, followed by some additional methods for reference.
In your revised question, you clarified that your goal was to stay within file size limitations while uploading images. In that case, playing around with JPEG compression options is fine, as suggested by rmaddy.
The interesting question is that you have two variables to play with: JPEG compression and image dimensions (there are others, too, but I'll keep it simple). How do you want to prioritize one over the other? For example, I don't think it makes sense to keep a full-resolution, absurdly compressed image (e.g. a 0.1 quality factor). Nor does it make sense to keep a tiny-resolution, uncompressed image. Personally, I'd iteratively adjust quality as suggested by rmaddy, but set some reasonable floor (e.g. JPEG quality not less than, say, 0.70). At that point, I might consider changing the image dimensions (which changes the file size pretty quickly, too), altering the dimensions until the resulting NSData was an appropriate size.
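A sketch of that combined strategy (the 0.70 floor and 500 KB target are just example numbers, and imageWithImage:scaledBy: is the hypothetical resize helper from the earlier answer):

// First walk the JPEG quality down toward a floor; if the data is still too
// big at the floor, start shrinking the image dimensions instead.
NSUInteger maxLength = 500 * 1024;
CGFloat quality = 1.0;
NSData *imageData = UIImageJPEGRepresentation(image, quality);

while (imageData.length > maxLength && quality > 0.70) {
    quality -= 0.05;
    imageData = UIImageJPEGRepresentation(image, quality);
}

CGFloat scale = 1.0;
UIImage *currentImage = image;

while (imageData.length > maxLength) {
    scale *= 0.9;                                               // 10% smaller dimensions per pass
    currentImage = [self imageWithImage:image scaledBy:scale];  // always resize the original image
    imageData = UIImageJPEGRepresentation(currentImage, quality);
}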
Anyway, in my original answer, I focused on the memory consumption within the app (as opposed to file size). For posterity's sake, see that answer below:
If you are trying to control how much memory is used when you load the images into UIImage objects to be used in UIKit objects, then playing around with JPEG compression won't help you much, because the internal representation of the images once you load them into UIKit objects is uncompressed. Thus, in that scenario, JPEG compression options don't accomplish much (other than sacrificing image quality).

To illustrate the idea, I have an image that is 1920 x 1080. I have it in PNG format (the file is 629 KB), a compressed JPEG format (217 KB), and a minimally compressed JPEG format (1.1 MB). But when I load those three different images into UIImageView objects (even if they have a very small frame), Instruments' "Allocations" tool shows me that each of them takes up 7.91 MB.

This is because when you load an image into an image view, the internal, uncompressed representation is four bytes per pixel (one byte for red, one for green, one for blue, and one for alpha). Thus my 1920 x 1080 images take up 1920 x 1080 x 4 = 8,294,400 bytes = 7.91 MB.
So, if you don't want them to take up more than 500 KB in memory when loading them into image view objects, that means you want to resize them such that the product of the width and the height is 128,000 pixels or less (512,000 bytes / 4 bytes per pixel; if square, that is just under 358 x 358 pixels).
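A sketch of that calculation, working out a scale factor that preserves the aspect ratio while keeping the uncompressed footprint under a chosen in-memory budget (the 500 KB budget is just the example figure, and image.scale is ignored for simplicity):

// Uncompressed bitmaps cost roughly 4 bytes per pixel, so cap width x height accordingly.
NSUInteger maxBytesInMemory = 500 * 1024;
CGFloat maxPixels = (CGFloat)maxBytesInMemory / 4.0;    // 512,000 / 4 = 128,000 pixels
CGFloat currentPixels = image.size.width * image.size.height;

if (currentPixels > maxPixels) {
    CGFloat scale = sqrt(maxPixels / currentPixels);    // shrink both dimensions by the same factor
    // Redraw at the reduced size, e.g. with the hypothetical imageWithImage:scaledBy: helper:
    UIImage *smallerImage = [self imageWithImage:image scaledBy:scale];
    NSLog(@"Resized to %.0f x %.0f", smallerImage.size.width, smallerImage.size.height);
}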
But if your concern is network bandwidth as you upload images, or persistent storage capacity, then go ahead and play around with JPEG compression values as suggested in rmaddy's excellent answer. If, on the other hand, you're trying to address memory consumption while the images are loaded into UIKit objects, then don't focus on compression; focus on resizing the image.