I have a requirement in which the user can select an image from the camera roll, and my application needs to optimize the image and upload it to a server. I am using the AssetsLibrary framework and a custom picker to let the user select a photo from the camera roll. I am able to achieve this, but I have a problem with the file size increasing.
The problem is as follows:
When the user selects a photo of 1.7 MB (that is the full size, as reported by iOS when I try to mail the photo) and I read the bytes directly from the ALAssetRepresentation of that ALAsset into an NSData, the data is 1.7 MB too.
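That read looks roughly like this (a minimal sketch; assetRepresentation is assumed to be the defaultRepresentation of the selected ALAsset, and error handling is trimmed):

    #import <AssetsLibrary/AssetsLibrary.h>

    // Sketch: copy the asset's original bytes straight into NSData.
    long long assetSize = [assetRepresentation size];
    uint8_t *buffer = (uint8_t *)malloc((size_t)assetSize);
    NSError *error = nil;
    NSUInteger bytesRead = [assetRepresentation getBytes:buffer
                                              fromOffset:0
                                                  length:(NSUInteger)assetSize
                                                   error:&error];
    NSData *rawData = [NSData dataWithBytesNoCopy:buffer
                                           length:bytesRead
                                     freeWhenDone:YES];
    NSLog(@"raw data: %lu bytes", (unsigned long)rawData.length); // ~1.7 MB here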
But the size changes when I create a UIImage from the same ALAssetRepresentation using the following code:
    UIImage *selImage = [UIImage imageWithCGImage:[assetRepresentation fullResolutionImage]
                                            scale:[assetRepresentation scale]
                                      orientation:(UIImageOrientation)[assetRepresentation orientation]];
When I then convert that UIImage to NSData using UIImageJPEGRepresentation() with compression quality 1.0f, the NSData size is almost double the original (3.2 MB). Shouldn't it ideally be 1.7 MB?
As expected, when I save the data to a file, the file size is still 3.2 MB (I checked it using Xcode 4.2 by importing the sandboxed app's Documents folder). But the interesting thing is that when I open the file with Preview on my Mac and check the byte size, it says 1.7 MB (was 3.2 MB), which is pretty confusing.
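For reference, I save and check the file roughly like this (a sketch; jpegData is from the snippet above, and the file name is illustrative):

    // Sketch: write the re-encoded JPEG into the Documents folder and
    // read back its on-disk size.
    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                          NSUserDomainMask, YES) objectAtIndex:0];
    NSString *path = [docs stringByAppendingPathComponent:@"photo.jpg"];
    [jpegData writeToFile:path atomically:YES];

    NSDictionary *attributes = [[NSFileManager defaultManager] attributesOfItemAtPath:path
                                                                                error:NULL];
    NSLog(@"on-disk size: %llu bytes", [attributes fileSize]); // still ~3.2 MB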
Please help. Thanks in advance.
What compression quality are you specifying? Is it the same as, or different from, what was used to encode the original image? If it is different, try re-encoding with the same compression factor and see whether the size matches.
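For example, you could log the re-encoded size at a few quality settings and see which one lands near the original 1.7 MB (a sketch; selImage is the decoded UIImage from your question):

    // Sketch: log the JPEG size at several compression qualities to see
    // which one approximates the camera's original encoding.
    for (CGFloat quality = 1.0f; quality >= 0.45f; quality -= 0.1f) {
        NSData *data = UIImageJPEGRepresentation(selImage, quality);
        NSLog(@"quality %.1f -> %lu bytes", quality, (unsigned long)data.length);
    }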