I need to get the GPS coordinates of an image taken with the iOS device's camera. I do not care about the Camera Roll images, just the image taken with UIImagePickerControllerSourceTypeCamera.
I've read many Stack Overflow answers, like Get Exif data from UIImage - UIImagePickerController, which either assume you are using the AssetsLibrary framework (which doesn't seem to work on camera images) or use CoreLocation to get the latitude/longitude from the app itself, not from the image.
Using CoreLocation is not an option. That will not give me the coordinates when the shutter button was pressed. (With the CoreLocation-based solutions, you either need to record the coordinates before you bring up the camera view or after, and of course if the device is moving the coordinates will be wrong. That method only works with a stationary device.)
I am targeting iOS 5 only, so I don't need to support older devices. This is also for a commercial product, so I cannot use http://code.google.com/p/iphone-exif/.
So, what are my options for reading the GPS data from the image returned by the camera in iOS 5? All I can think of right now is to save the image to the Camera Roll and then use the AssetsLibrary, but that seems hokey.
Thanks!
Here's the code I wrote based on Caleb's answer.
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
NSData *jpeg = UIImageJPEGRepresentation(image, 1.0);
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpeg, NULL);
NSDictionary *metadataNew = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
NSLog(@"%@", metadataNew);
CFRelease(source);
and my Console shows:
2012-04-26 14:15:37:137 ferret[2060:1799] {
    ColorModel = RGB;
    Depth = 8;
    Orientation = 6;
    PixelHeight = 1936;
    PixelWidth = 2592;
    "{Exif}" = {
        ColorSpace = 1;
        PixelXDimension = 2592;
        PixelYDimension = 1936;
    };
    "{JFIF}" = {
        DensityUnit = 0;
        JFIFVersion = (
            1,
            1
        );
        XDensity = 1;
        YDensity = 1;
    };
    "{TIFF}" = {
        Orientation = 6;
    };
}
No latitude/longitude.
As pointed out by Chris Markle, Apple does strip the GPS data out of the EXIF when the image is handed back to you. But you can open the raw data of the image and parse it yourself, or use a third-party library to do that.
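As a rough sketch of that approach, assuming the photo's untouched bytes are available as an NSData (e.g. straight from AVFoundation's still-image output rather than re-encoded through a UIImage; the helper name is just for illustration), ImageIO will hand back whatever GPS dictionary the data actually carries:

#import <Foundation/Foundation.h>
#import <ImageIO/ImageIO.h>

// Sketch only: returns the GPS dictionary embedded in raw image data, or nil
// when the data carries no GPS tags.
static NSDictionary *GPSDictionaryForImageData(NSData *rawImageData)
{
    NSDictionary *gps = nil;
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)rawImageData, NULL);
    if (source) {
        NSDictionary *properties = (__bridge_transfer NSDictionary *)
            CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
        gps = [properties objectForKey:(NSString *)kCGImagePropertyGPSDictionary];
        CFRelease(source);
    }
    return gps;
}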
One possibility is to leave CoreLocation running while the camera is visible. Record each CLLocation into an array along with the time of the sample. When the photo comes back, find its time, then match the closest CLLocation from the array.
Sounds kludgy, but it will work.
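Something like this, as a rough sketch (the class name is made up, and it assumes ARC plus the iOS 5-era CoreLocation delegate callback):

#import <CoreLocation/CoreLocation.h>

// Collects every fix that arrives while the picker is on screen, then hands
// back the one recorded closest to the moment the photo came in.
@interface ShutterLocationRecorder : NSObject <CLLocationManagerDelegate>
@property (nonatomic, strong) CLLocationManager *manager;
@property (nonatomic, strong) NSMutableArray *samples; // CLLocation objects
@end

@implementation ShutterLocationRecorder

- (id)init
{
    if ((self = [super init])) {
        _samples = [NSMutableArray array];
        _manager = [[CLLocationManager alloc] init];
        _manager.delegate = self;
        _manager.desiredAccuracy = kCLLocationAccuracyBest;
        [_manager startUpdatingLocation];
    }
    return self;
}

// iOS 5-era delegate callback; each CLLocation carries its own timestamp.
- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation
{
    [self.samples addObject:newLocation];
}

// Call this from -imagePickerController:didFinishPickingMediaWithInfo: with
// the time the photo came back; it returns the nearest recorded fix.
- (CLLocation *)locationClosestToDate:(NSDate *)shutterDate
{
    CLLocation *best = nil;
    NSTimeInterval bestDelta = 0;
    for (CLLocation *sample in self.samples) {
        NSTimeInterval delta = fabs([sample.timestamp timeIntervalSinceDate:shutterDate]);
        if (best == nil || delta < bestDelta) {
            bestDelta = delta;
            best = sample;
        }
    }
    return best;
}

@end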
Can't say I've needed to do exactly this in my own stuff, but from the docs it seems pretty clear that if you're using UIImagePickerController you can get the image that the user just took from the -imagePickerController:didFinishPickingMediaWithInfo: delegate method. Use the key UIImagePickerControllerOriginalImage to get the image. Once you've got the image, you should be able to access its properties, including EXIF data, as described in QA1654 Accessing image properties with ImageIO. To create the CGImageSource, I'd look at CGImageSourceCreateWithData() and use the data that you get from the UIImage's CGImage method. Once you've got the image source, you can access the various attributes via CGImageSourceCopyProperties().
You're not using the image data from the camera in the code you've posted; you've generated a JPEG representation of it, which would essentially discard all the metadata. Use image.CGImage like Caleb suggested.
Also: the author quite clearly states that commercial licensing is available.
This is tested on iOS 8 and works for videos, so it should work similarly for photos with a few tweaks.
Swift answer: