I am trying to find a way to read and write JPEG images to the user's gallery (camera roll) without iOS re-compressing them.
UIImage seems to be the bottleneck here. The only method I've found for saving to the gallery is UIImageWriteToSavedPhotosAlbum(). Is there a way around this?
For now, my routine looks like this:
–Ask UIImagePickerController for a photo, and when its delegate receives didFinishPickingMediaWithInfo:, do (a full delegate sketch follows these steps):
NSData *imgdata = [NSData dataWithData:UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], 1)];
[imgdata writeToFile:filePath atomically:NO];
–Process JPEG losslessly on disk.
–Then save it back:
UIImageWriteToSavedPhotosAlbum([UIImage imageWithContentsOfFile:[self getImagePath]], self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
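For reference, here is a minimal sketch of how those snippets sit in the picker delegate, assuming filePath and [self getImagePath] refer to the same file:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // First lossy pass: the picker hands over an already-decoded UIImage,
    // and UIImageJPEGRepresentation re-encodes it even at quality 1.
    UIImage *picked = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    NSData *imgdata = [NSData dataWithData:UIImageJPEGRepresentation(picked, 1)];
    [imgdata writeToFile:[self getImagePath] atomically:NO];
    [picker dismissModalViewControllerAnimated:YES];
    // The lossless processing and the UIImageWriteToSavedPhotosAlbum call happen later.
}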
Here is a tiny animation of what the quality degradation looks like after 3 passes.
It obviously gets worse with each pass, but I couldn't automate the image-picking part to properly test it over 50/100/1000 cycles (a simulation sketch follows).
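That said, the decode/re-encode half of the cycle can be simulated without the picker; a rough sketch that skips the gallery round trip and only exercises UIImage's decode/encode:

NSData *data = [NSData dataWithContentsOfFile:[self getImagePath]];
for (NSUInteger pass = 0; pass < 1000; pass++) {
    // Decode, then re-encode at maximum quality; each pass still re-compresses.
    UIImage *decoded = [UIImage imageWithData:data];
    data = UIImageJPEGRepresentation(decoded, 1);
}
[data writeToFile:[self getImagePath] atomically:NO];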
UIImage decodes the image data so it can be edited and displayed, so
UIImageWriteToSavedPhotosAlbum([UIImage imageWithContentsOfFile:[self getImagePath]], self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
would first decode the image, and then re-encode it inside the UIImageWriteToSavedPhotosAlbum method.
Instead, you should use ALAssetsLibrary/writeImageDataToSavedPhotosAlbum:metadata:completionBlock:, which takes the raw image NSData, something like this:
ALAssetsLibrary *assetLib = [[[ALAssetsLibrary alloc] init] autorelease];
[assetLib writeImageDataToSavedPhotosAlbum:[NSData dataWithContentsOfFile:[self getImagePath]] metadata:nil completionBlock:nil];
You may also pass metadata and a completion block to the call, for example:
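Here is a sketch that passes the picker's metadata back through and logs the result; pickedMetadata is a hypothetical dictionary you stashed earlier from [info objectForKey:UIImagePickerControllerMediaMetadata]:

ALAssetsLibrary *assetLib = [[[ALAssetsLibrary alloc] init] autorelease];
NSData *jpegData = [NSData dataWithContentsOfFile:[self getImagePath]];
[assetLib writeImageDataToSavedPhotosAlbum:jpegData
                                  metadata:pickedMetadata // hypothetical; saved earlier from the picker info
                           completionBlock:^(NSURL *assetURL, NSError *error) {
                               if (error) {
                                   NSLog(@"Save failed - %@", [error localizedDescription]);
                               } else {
                                   NSLog(@"Saved unaltered JPEG at %@", assetURL);
                               }
                           }];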
EDIT:
For getting the image: [info objectForKey:@"UIImagePickerControllerOriginalImage"] contains the decoded UIImage selected from UIImagePickerController. You should instead use:
NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
Using the assetURL you can now get the ALAsset for it using the ALAssetsLibrary/assetForURL:resultBlock:failureBlock: method:
ALAssetsLibrary *assetLib = [[[ALAssetsLibrary alloc] init] autorelease];
[assetLib assetForURL:assetURL resultBlock:resultBlock failureBlock:failureBlock];
You can now get the unaltered NSData of that image:
ALAssetsLibraryAssetForURLResultBlock resultBlock = ^(ALAsset *asset) {
    ALAssetRepresentation *assetRep = [asset defaultRepresentation];
    long long imageDataSize = [assetRep size];
    uint8_t *imageDataBytes = malloc((size_t)imageDataSize);
    [assetRep getBytes:imageDataBytes fromOffset:0 length:(NSUInteger)imageDataSize error:nil];
    // You could instead read the data in smaller buffers and append them to your file
    // rather than reading it all at once (a buffered variant is sketched after the failure block).
    NSData *imageData = [NSData dataWithBytesNoCopy:imageDataBytes length:(NSUInteger)imageDataSize freeWhenDone:YES];
    // save it
    [imageData writeToFile:filePath atomically:NO];
};
ALAssetsLibraryAccessFailureBlock failureBlock = ^(NSError *myerror) {
    NSLog(@"Cannot get image - %@", [myerror localizedDescription]);
};
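Note that in real code both blocks must be declared before the assetForURL:resultBlock:failureBlock: call, and the library object has to outlive the asynchronous callback. As the comment in the result block hints, you can also stream the bytes instead of doing one big malloc; a rough sketch of that variant, assuming the same assetRep and filePath as above:

NSOutputStream *stream = [NSOutputStream outputStreamToFileAtPath:filePath append:NO];
[stream open];
uint8_t buffer[65536];
long long offset = 0;
long long total = [assetRep size];
while (offset < total) {
    // Copy up to 64 KB of the raw asset bytes per iteration.
    NSUInteger read = [assetRep getBytes:buffer fromOffset:offset length:sizeof(buffer) error:nil];
    if (read == 0) break; // a real implementation would check the NSError here
    [stream write:buffer maxLength:read];
    offset += read;
}
[stream close];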
I may have made some mistakes in the code, but the steps are as listed above. If something doesn't work right, or if you want to make it a little more efficient, there are plenty of examples of things like reading NSData from an ALAsset on Stack Overflow and other sites.