I am trying to copy an image from UIImagePickerController to the documents directory. I use the UIImagePickerControllerOriginalImage key to get the original image from the info dictionary passed to the UIImagePickerControllerDelegate, and I write the image to a file using UIImagePNGRepresentation. When I repeat this process with high-resolution images (approximately 20 MB each), I run into memory issues. I profiled with Xcode's leaks instrument and it zoomed in on the following piece of code as the source of the leakage:
@autoreleasepool {
    // Encode the UIImage as PNG data, then write it out.
    imagesData = UIImagePNGRepresentation(images);
    [imagesData writeToFile:name atomically:NO];
    imagesData = nil;
    //[UIImageJPEGRepresentation(images, 1.0) writeToFile:name atomically:YES];
}
I have seen many questions here about memory leaks caused by UIImagePNGRepresentation, but I haven't found a proper solution to my problem. Any help would be appreciated.
I'm unaware of any "leak" in UIImagePNGRepresentation, but it is certainly an extravagant use of memory. There are a couple of issues here:

First, the process of round-tripping the original asset through a UIImage and then using UIImagePNGRepresentation() is fairly inefficient and can end up with an NSData that is considerably larger than the original asset. For example, I picked a photo whose original asset was 1.5 MB; the UIImageJPEGRepresentation (with a compressionQuality of 1.0) was 6 MB, and the UIImagePNGRepresentation() was about 10 MB. (These numbers can change quite a bit from image to image, but you get the basic idea.)

You can often mitigate this problem by using UIImageJPEGRepresentation with a compressionQuality of less than 1.0 (e.g. 0.8 or 0.9 offers minimal image quality loss, yet an observable reduction in NSData size). But this is a lossy compression, and you also lose some image metadata in the process.

Second, I believe you are holding multiple copies of the same image in memory at the same time: you have both the UIImage representation and the NSData object.

Not only is the NSData representation of the asset larger than it needs to be, you're also loading the entire asset into memory at once. This is not necessary.

You might consider, instead, streaming the original asset from the ALAssetsLibrary directly to persistent storage, without using UIImagePNGRepresentation or UIImageJPEGRepresentation, and without loading it into a UIImage at all. Instead, create a small buffer and repeatedly fill it with portions of the original asset via getBytes, writing that buffer to a temporary file with an NSOutputStream as you go. Repeat until the entire asset has been written to persistent storage. The total memory footprint of this process is much lower than the alternative approaches. For example:
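The answer's original listing was not included here; the following is a sketch of the streaming approach described above, assuming the pre-Photos ALAssetsLibrary API. The `asset` would be looked up via the URL from the picker's UIImagePickerControllerReferenceURL key; the method name `writeAsset:toPath:` is illustrative.

```objectivec
#import <AssetsLibrary/AssetsLibrary.h>

- (void)writeAsset:(ALAsset *)asset toPath:(NSString *)path {
    ALAssetRepresentation *representation = [asset defaultRepresentation];

    NSOutputStream *stream = [NSOutputStream outputStreamToFileAtPath:path append:NO];
    [stream open];

    long long remaining = representation.size;
    long long offset = 0;
    uint8_t buffer[32 * 1024];          // small fixed-size buffer; only this
                                        // much of the asset is in memory at once

    while (remaining > 0) {
        NSError *error = nil;
        NSUInteger bytesRead = [representation getBytes:buffer
                                             fromOffset:offset
                                                 length:MIN(sizeof(buffer), (NSUInteger)remaining)
                                                  error:&error];
        if (bytesRead == 0) {
            NSLog(@"getBytes failed: %@", error);
            break;
        }
        [stream write:buffer maxLength:bytesRead];
        offset += bytesRead;
        remaining -= bytesRead;
    }

    [stream close];
}
```

Because this copies the asset's original bytes verbatim, the file on disk keeps the original format, size, and metadata, with no re-encoding at all.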
I solved this issue by sending a 4-channel image (RGBA or RGBX) instead of a 3-channel image (RGB). Check whether you can change the parameters of your image: use kCGImageAlphaNoneSkipLast instead of kCGImageAlphaNone.
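As a rough illustration of this suggestion, a sketch that redraws a UIImage into a 4-channel (RGBX) bitmap context before further processing; `image` is assumed to hold the picked photo:

```objectivec
#import <UIKit/UIKit.h>

CGImageRef source = image.CGImage;
size_t width = CGImageGetWidth(source);
size_t height = CGImageGetHeight(source);

// Create an RGBX context: 8 bits per component, 4 bytes per pixel,
// alpha channel present in memory but ignored (kCGImageAlphaNoneSkipLast).
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, width * 4,
                                             colorSpace,
                                             kCGImageAlphaNoneSkipLast);

// Redraw the original image into the 4-channel context.
CGContextDrawImage(context, CGRectMake(0, 0, width, height), source);
CGImageRef rgbxImage = CGBitmapContextCreateImage(context);
UIImage *converted = [UIImage imageWithCGImage:rgbxImage];

CGImageRelease(rgbxImage);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
```

Note that redrawing discards the original compressed bytes, so this only helps with the pixel-format issue, not with the overall memory cost of holding the decoded image.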