Does anybody have an idea how to fetch square thumbnails from PHImageManager? The PHImageContentModeAspectFill option has no effect.
[[PHImageManager defaultManager]
    requestImageForAsset:(PHAsset *)_asset
              targetSize:CGSizeMake(80, 80)
             contentMode:PHImageContentModeAspectFill
                 options:nil
           resultHandler:^(UIImage *result, NSDictionary *info) {
               // Sadly, result is not a square image
               imageView.image = result;
           }];
Update:
The bug in cropping images as they are retrieved from PHImageManager was supposedly fixed in iOS 8.3, but in practice the bugs are still there up to and including iOS 8.4: I can reproduce them with standard iPhone 6s back camera images when taking a full-size square crop. They are properly fixed in iOS 9.0, where even large crops of a 63 megapixel panorama work fine. On those fixed versions of iOS, my original example works, as follows:
The approach Apple defines is to pass a CGRect in the co-ordinate space of the image, where the origin is (0,0) and the maximum is (1,1). You pass this rect in the PHImageRequestOptions object, along with a resizeMode of PHImageRequestOptionsResizeModeExact, and then you should get back a cropped image.
- (void)showSquareImageForAsset:(PHAsset *)asset
{
    NSInteger retinaScale = [UIScreen mainScreen].scale;
    CGSize retinaSquare = CGSizeMake(100 * retinaScale, 100 * retinaScale);

    PHImageRequestOptions *cropToSquare = [[PHImageRequestOptions alloc] init];
    cropToSquare.resizeMode = PHImageRequestOptionsResizeModeExact;

    CGFloat cropSideLength = MIN(asset.pixelWidth, asset.pixelHeight);
    CGRect square = CGRectMake(0, 0, cropSideLength, cropSideLength);
    CGRect cropRect = CGRectApplyAffineTransform(square,
                                                 CGAffineTransformMakeScale(1.0 / asset.pixelWidth,
                                                                            1.0 / asset.pixelHeight));
    cropToSquare.normalizedCropRect = cropRect;

    [[PHImageManager defaultManager]
        requestImageForAsset:(PHAsset *)asset
                  targetSize:retinaSquare
                 contentMode:PHImageContentModeAspectFit
                     options:cropToSquare
               resultHandler:^(UIImage *result, NSDictionary *info) {
                   self.imageView.image = result;
               }];
}
This example makes its cropRect of side length equal to the smaller of the width and height of the asset, and then transforms it into the co-ordinate space of the image using CGRectApplyAffineTransform. You may want to set the origin of square to something other than (0,0), as often you want the crop square centred along the axis of the image which is being cropped, but I'll leave that as an exercise for the reader. :-)
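For what it's worth, here is a minimal sketch of one way to do that centring. The helper name centredSquareCropRectForAsset is my own invention for illustration, not part of the Photos framework; it just builds the normalizedCropRect you would assign to the request options.

#import <Photos/Photos.h>

// Hypothetical helper: builds a normalizedCropRect for a centred square crop.
// Assumes the asset's pixelWidth and pixelHeight are non-zero.
static CGRect centredSquareCropRectForAsset(PHAsset *asset)
{
    CGFloat side = MIN(asset.pixelWidth, asset.pixelHeight);

    // Offset the square so it is centred along the longer axis of the image.
    CGFloat x = (asset.pixelWidth - side) / 2.0;
    CGFloat y = (asset.pixelHeight - side) / 2.0;
    CGRect square = CGRectMake(x, y, side, side);

    // Convert from pixel co-ordinates to the normalized (0,0)-(1,1) space.
    return CGRectApplyAffineTransform(square,
                                      CGAffineTransformMakeScale(1.0 / asset.pixelWidth,
                                                                 1.0 / asset.pixelHeight));
}

You would then set cropToSquare.normalizedCropRect = centredSquareCropRectForAsset(asset); in place of the rect built above.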
Original Answer:
John's answer got me most of the way there, but using his code I was getting stretched and squashed images. Here's how I got an imageView to display square thumbnails fetched from the PHImageManager.

Firstly, ensure that the contentMode property of your UIImageView is set to ScaleAspectFill. The default is ScaleToFill, which doesn't work correctly for displaying square thumbnails from PHImageManager, so make sure you change this whether you've instantiated the UIImageView in code or in the storyboard.
// View dimensions are in points, but we're requesting pixels from PHImageManager
NSInteger retinaMultiplier = [UIScreen mainScreen].scale;
CGSize retinaSquare = CGSizeMake(imageView.bounds.size.width * retinaMultiplier,
                                 imageView.bounds.size.height * retinaMultiplier);

[[PHImageManager defaultManager]
    requestImageForAsset:(PHAsset *)_asset
              targetSize:retinaSquare
             contentMode:PHImageContentModeAspectFill
                 options:nil
           resultHandler:^(UIImage *result, NSDictionary *info) {
               // The result is not square, but displays correctly as a square when using AspectFill
               imageView.image = result;
           }];
Specifying PHImageRequestOptionsResizeModeExact for the resizeMode is not required, as it will not give you a cropped image unless you also supply a normalizedCropRect, and it should not be used here: there's no benefit, and using it means you lose the benefit of quickly returned cached images.
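(As an aside, and purely as a sketch of my own: if you want to lean on those fast cached results, for example in a scrolling grid of thumbnails, the standard request options for that look roughly like this.)

// Sketch only: options tuned for fast thumbnail delivery rather than exact sizing.
PHImageRequestOptions *fastOptions = [[PHImageRequestOptions alloc] init];
fastOptions.resizeMode = PHImageRequestOptionsResizeModeFast;
fastOptions.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;
// With opportunistic delivery the result handler may be called more than once:
// first with a quickly available low-quality image, later with the final one.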
The UIImage returned in result will have the same aspect ratio as the source, but it is scaled correctly for use in a UIImageView which is set to aspect fill, so it displays as a square. If you're just displaying the image, this is the way to go. If you need to crop the image for print or export outside of the app, this isn't what you want - look into the use of normalizedCropRect for that. (edit - see below for an example of what should work...)
In addition to this, make sure that you set the content mode of the UIImageView to UIViewContentModeScaleAspectFill and that you set clipsToBounds = YES, with the following two lines:

imageView.contentMode = UIViewContentModeScaleAspectFill;
imageView.clipsToBounds = YES;
Edit to add normalizedCropRect usage example
WARNING - this doesn't work, but should according to Apple's documentation.
The approach Apple defines is to pass a CGRect in the co-ordinate space of the image, where the origin is (0,0) and the maximum is (1,1). You pass this rect in the PHImageRequestOptions object, along with a resizeMode of PHImageRequestOptionsResizeModeExact, and then you should get back a cropped image. The problem is that you don't - it comes back at the original aspect ratio, as the full image.

I've verified that the crop rect is created correctly in the image's co-ordinate space, and followed the instruction to use PHImageRequestOptionsResizeModeExact, but the result handler is still passed an image in the original aspect ratio. This seems to be a bug in the framework, and when it is fixed, the following code should work.
- (void)showSquareImageForAsset:(PHAsset *)asset
{
    NSInteger retinaScale = [UIScreen mainScreen].scale;
    CGSize retinaSquare = CGSizeMake(100 * retinaScale, 100 * retinaScale);

    PHImageRequestOptions *cropToSquare = [[PHImageRequestOptions alloc] init];
    cropToSquare.resizeMode = PHImageRequestOptionsResizeModeExact;

    CGFloat cropSideLength = MIN(asset.pixelWidth, asset.pixelHeight);
    CGRect square = CGRectMake(0, 0, cropSideLength, cropSideLength);
    CGRect cropRect = CGRectApplyAffineTransform(square,
                                                 CGAffineTransformMakeScale(1.0 / asset.pixelWidth,
                                                                            1.0 / asset.pixelHeight));
    cropToSquare.normalizedCropRect = cropRect;

    [[PHImageManager defaultManager]
        requestImageForAsset:(PHAsset *)asset
                  targetSize:retinaSquare
                 contentMode:PHImageContentModeAspectFit
                     options:cropToSquare
               resultHandler:^(UIImage *result, NSDictionary *info) {
                   self.imageView.image = result;
               }];
}
All I can suggest is that if you have this problem, you file a radar with Apple to request that they fix it!
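In the meantime, one possible workaround (my own sketch, using only standard Core Graphics and UIKit calls, and ignoring EXIF orientation for simplicity) is to crop the returned image yourself inside the result handler:

// Sketch: manually crop the (uncropped) result to a centred square.
// Assumes 'result' is the UIImage passed to the result handler.
UIImage *squareImage = nil;
if (result.CGImage) {
    CGFloat pixelWidth  = result.size.width  * result.scale;
    CGFloat pixelHeight = result.size.height * result.scale;
    CGFloat side = MIN(pixelWidth, pixelHeight);
    CGRect cropRect = CGRectMake((pixelWidth - side) / 2.0,
                                 (pixelHeight - side) / 2.0,
                                 side, side);
    CGImageRef croppedRef = CGImageCreateWithImageInRect(result.CGImage, cropRect);
    squareImage = [UIImage imageWithCGImage:croppedRef
                                      scale:result.scale
                                orientation:result.imageOrientation];
    CGImageRelease(croppedRef);
}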
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeExact;

NSInteger retinaMultiplier = [UIScreen mainScreen].scale;
CGSize retinaSquare = CGSizeMake(imageView.bounds.size.width * retinaMultiplier,
                                 imageView.bounds.size.height * retinaMultiplier);

[[PHImageManager defaultManager]
    requestImageForAsset:(PHAsset *)_asset
              targetSize:retinaSquare
             contentMode:PHImageContentModeAspectFill
                 options:options
           resultHandler:^(UIImage *result, NSDictionary *info) {
               imageView.image = [UIImage imageWithCGImage:result.CGImage
                                                     scale:retinaMultiplier
                                               orientation:result.imageOrientation];
           }];
To get an exact square, you'll have to indicate that you want an exact size by passing options, like so:
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
// No, really, we want this exact size
options.resizeMode = PHImageRequestOptionsResizeModeExact;

[[PHImageManager defaultManager]
    requestImageForAsset:(PHAsset *)_asset
              targetSize:CGSizeMake(160, 160)
             contentMode:PHImageContentModeAspectFill
                 options:options
           resultHandler:^(UIImage *result, NSDictionary *info) {
               // Happily, result is now a square image
               imageView.image = result;
           }];
This is working fine for me:

__block UIImage *imgThumb;
CGSize size = CGSizeMake(45, 45); // size for the square image

[self.imageManager requestImageForAsset:<your_current_phAsset>
                             targetSize:size
                            contentMode:PHImageContentModeAspectFill
                                options:nil
                          resultHandler:^(UIImage *result, NSDictionary *info) {
                              imgThumb = result;
                          }];
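One caveat worth adding (my note, not part of the snippet above): with options:nil the request may complete asynchronously, so imgThumb can still be nil immediately after the call returns. If you really need the image inline, a sketch using the synchronous option of PHImageRequestOptions looks like this; avoid doing it on the main thread for large images, as it blocks until the image is loaded.

// Sketch: a synchronous request so the result is available right after the call.
PHImageRequestOptions *syncOptions = [[PHImageRequestOptions alloc] init];
syncOptions.synchronous = YES;

__block UIImage *imgThumb = nil;
[[PHImageManager defaultManager] requestImageForAsset:asset // 'asset' stands in for your PHAsset
                                           targetSize:CGSizeMake(45, 45)
                                          contentMode:PHImageContentModeAspectFill
                                              options:syncOptions
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
                                            imgThumb = result;
                                        }];
// imgThumb is populated here because the request was synchronous.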