Memory Leak - UIImagePNGRepresentation

Posted 2019-07-13 01:09

I am trying to copy an image from a UIImagePickerController to the documents directory. I use the @"UIImagePickerControllerOriginalImage" key to get the original image from the info dictionary delivered to the UIImagePickerController delegate, and I write the image to a file using UIImagePNGRepresentation. When I repeat this process with high-resolution images (roughly 20 MB each), I run into memory issues.

I profiled the app with Xcode's memory-leak instrument, and it zoomed in on the following piece of code as being responsible for the leak.

@autoreleasepool {
    imagesData = UIImagePNGRepresentation(images);
    [imagesData writeToFile:name atomically:NO];
    imagesData = nil;
    //[UIImageJPEGRepresentation(images, 1.0) writeToFile:name atomically:YES];
}

I have seen many questions here about memory leaks caused by UIImagePNGRepresentation, but I haven't found a proper solution to my problem. Any help would be appreciated.

2 Answers

一夜七次 · 2019-07-13 01:59

I'm not aware of any actual "leak" in UIImagePNGRepresentation, but it certainly is an extravagant use of memory. There are a couple of issues here:

  1. First, the process of round-tripping the original asset through a UIImage and then using UIImagePNGRepresentation() is fairly inefficient and can produce an NSData that is considerably larger than the original asset. For example, I picked a photo whose original asset was 1.5 MB; the UIImageJPEGRepresentation() (with a compressionQuality of 1.0) was 6 MB, and the UIImagePNGRepresentation() was about 10 MB. (These numbers can change quite a bit from image to image, but you get the basic idea.)

    You can often mitigate this problem by using UIImageJPEGRepresentation with a compressionQuality of less than 1.0 (e.g. 0.8 or 0.9 offers minimal image-quality loss but an observable reduction in NSData size), as shown in the sketch after this list. But this is lossy compression, and you also lose some of the image's metadata in the process.

  2. I believe you are holding multiple copies of the same image in memory at the same time: both the UIImage representation and the NSData object.

  3. Not only is the NSData representation of the asset larger than it needs to be, you're also loading the entire asset into memory at one time. This is not necessary.
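
For a rough illustration of the trade-off in point 1, here is a minimal sketch comparing the two representations (the 0.9 quality value and the image/path variables are placeholders, not something from your code):

// Compare the size of the PNG representation with a slightly compressed JPEG.
// `image` is assumed to be the picked UIImage, `path` the destination file path.
NSData *pngData  = UIImagePNGRepresentation(image);
NSData *jpegData = UIImageJPEGRepresentation(image, 0.9);   // lossy, but usually far smaller

NSLog(@"PNG: %lu bytes, JPEG(0.9): %lu bytes",
      (unsigned long)pngData.length, (unsigned long)jpegData.length);

NSError *error;
if (![jpegData writeToFile:path options:NSDataWritingAtomic error:&error]) {
    NSLog(@"write failed: %@", error);
}

Note that this still materializes both the UIImage and the full NSData in memory at once, which is why, for very large assets, I'd favor the streaming approach below.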

You might consider, instead, streaming the original asset from the ALAssetsLibrary directly to persistent storage, without using UIImagePNGRepresentation or UIImageJPEGRepresentation, and without loading it into a UIImage at all. Instead, create a small buffer, repeatedly fill it with portions of the original asset via getBytes, and write that buffer to a temporary file with an NSOutputStream as you go. Repeat this until the entire asset has been written to persistent storage. The total memory footprint of this process is much lower than that of the alternative approaches.

For example:

static NSInteger kBufferSize = 1024 * 10;

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *url = info[UIImagePickerControllerReferenceURL];

    [self.library assetForURL:url resultBlock:^(ALAsset *asset) {
        ALAssetRepresentation *representation = [asset defaultRepresentation];
        long long remaining = representation.size;
        NSString *filename  = representation.filename;

        NSString *documentsPath = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
        NSString *path = [documentsPath stringByAppendingPathComponent:filename];
        NSString *tempPath = [self pathForTemporaryFileWithPrefix:@"ALAssetDownload"];

        NSOutputStream *outputStream = [NSOutputStream outputStreamToFileAtPath:tempPath append:NO];
        NSAssert(outputStream, @"Unable to create output stream");

        [outputStream open];

        long long representationOffset = 0ll;
        NSError *error;

        uint8_t buffer[kBufferSize];

        while (remaining > 0ll) {
            // getBytes:fromOffset:length:error: returns the number of bytes copied,
            // or 0 (with `error` populated) if the read failed.
            NSUInteger bytesRetrieved = [representation getBytes:buffer fromOffset:representationOffset length:sizeof(buffer) error:&error];
            if (bytesRetrieved == 0) {
                NSLog(@"failed getBytes: %@", error);
                [outputStream close];
                [[NSFileManager defaultManager] removeItemAtPath:tempPath error:nil];
                return;
            } else {
                remaining -= bytesRetrieved;
                representationOffset += bytesRetrieved;
                [outputStream write:buffer maxLength:bytesRetrieved];
            }
        }

        [outputStream close];

        if (![[NSFileManager defaultManager] moveItemAtPath:tempPath toPath:path error:&error]) {
            NSLog(@"Unable to move file: %@", error);
        }

    } failureBlock:^(NSError *error) {
        NSLog(@"assetForURL error = %@", error);
    }];
}

- (NSString *)pathForTemporaryFileWithPrefix:(NSString *)prefix
{
    NSString    *uuidString = [[NSUUID UUID] UUIDString];

    // If supporting iOS versions prior to 6.0, you can use:
    //
    // CFUUIDRef uuid = CFUUIDCreate(NULL);
    // assert(uuid != NULL);
    // NSString *uuidString = CFBridgingRelease(CFUUIDCreateString(NULL, uuid));
    // CFRelease(uuid);

    return [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%@-%@", prefix, uuidString]];
}
劫难 · 2019-07-13 02:06

I solved this issue by using a 4-channel image (RGBA or RGBX) instead of a 3-channel image (RGB). Check whether you can change the parameters of your image.

Use kCGImageAlphaNoneSkipLast instead of kCGImageAlphaNone.
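
For example, if the image passes through a bitmap context somewhere in your pipeline, a minimal sketch of redrawing it as RGBX might look like this (the device RGB color space and the image variable are assumptions about your setup):

// Redraw a UIImage into an RGBX bitmap context: four channels, with the last
// (alpha) byte present but ignored. `image` is assumed to be your source UIImage.
CGImageRef source = image.CGImage;
size_t width  = CGImageGetWidth(source);
size_t height = CGImageGetHeight(source);

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, 0, colorSpace,
                                             (CGBitmapInfo)kCGImageAlphaNoneSkipLast); // not kCGImageAlphaNone
CGContextDrawImage(context, CGRectMake(0, 0, width, height), source);

CGImageRef rgbxImage = CGBitmapContextCreateImage(context);
UIImage *result = [UIImage imageWithCGImage:rgbxImage];   // use this in place of the original image

CGImageRelease(rgbxImage);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);

The reason this matters is that 8-bit-per-component RGB with kCGImageAlphaNone is not one of the pixel formats CGBitmapContextCreate supports, while the 32-bit RGBX layout (kCGImageAlphaNoneSkipLast) is.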
