I am trying to find a way to read and write JPEG images to the user's photo library (camera roll) without iOS re-compressing them. UIImage seems to be the bottleneck here. The only method I've found for saving to the photo library is UIImageWriteToSavedPhotosAlbum(). Is there a way around this?
For now my routine looks like this:
– Ask UIImagePickerController for a photo. When it calls didFinishPickingMediaWithInfo:, do:
NSData *imgdata = [NSData dataWithData:UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], 1)];
[imgdata writeToFile:filePath atomically:NO];
– Process the JPEG losslessly on disk.
– Then save it back:
UIImageWriteToSavedPhotosAlbum([UIImage imageWithContentsOfFile:[self getImagePath]], self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
Here is a tiny animation of what quality degradation looks like after 3 passes:
It obviously gets worse with each pass, but I couldn't automate the image-picking part in order to fully test it for 50/100/1000 cycles.
UIImage decodes the image data so it can be edited and displayed, so UIImageWriteToSavedPhotosAlbum would first decode the image and then encode it back. Instead, you should use ALAssetsLibrary's writeImageDataToSavedPhotosAlbum:metadata:completionBlock:, something like this:
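A minimal sketch of that call, assuming you already have the processed JPEG bytes in an NSData (the `processedJPEGData` and `imageMetadata` variables here are placeholders for your own data):

```objc
#import <AssetsLibrary/AssetsLibrary.h>

// Write the raw JPEG bytes to the camera roll without re-encoding.
// processedJPEGData and imageMetadata are assumed to exist already.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageDataToSavedPhotosAlbum:processedJPEGData
                                 metadata:imageMetadata
                          completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Save failed: %@", error);
    } else {
        NSLog(@"Saved untouched JPEG at %@", assetURL);
    }
}];
```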
You may also pass metadata and the completion block to the call.
EDIT:
For getting the image: [info objectForKey:@"UIImagePickerControllerOriginalImage"] contains the decoded UIImage selected from UIImagePickerController. You should instead use the asset URL of the picked photo (the UIImagePickerControllerReferenceURL key of the info dictionary). Using the assetURL you can get the ALAsset for it with the ALAssetsLibrary assetForURL:resultBlock:failureBlock: method, and from that ALAsset you can read the unaltered NSData of the image:
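A sketch of those two steps inside the picker delegate (the key and method names are real AssetsLibrary/UIKit API; the surrounding error handling is illustrative):

```objc
#import <AssetsLibrary/AssetsLibrary.h>

// In didFinishPickingMediaWithInfo: fetch the original, never-re-encoded bytes.
NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    // Copy the raw JPEG bytes straight out of the asset.
    uint8_t *buffer = malloc((size_t)rep.size);
    NSError *error = nil;
    NSUInteger bytesRead = [rep getBytes:buffer
                              fromOffset:0
                                  length:(NSUInteger)rep.size
                                   error:&error];
    NSData *jpegData = [NSData dataWithBytesNoCopy:buffer
                                            length:bytesRead
                                      freeWhenDone:YES];
    // jpegData now holds the untouched JPEG from the photo library.
} failureBlock:^(NSError *error) {
    NSLog(@"Could not load asset: %@", error);
}];
```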
I may have made some mistakes in the code, but the steps are as listed above. If something doesn't work right, or if you want to make this a little more efficient, there are plenty of examples of reading NSData from an ALAsset on Stack Overflow and other sites.