How to properly crop a UIImage taken with the camera

Posted 2019-05-29 18:48

This is the camera overlay for my app:

[screenshot: camera overlay with the yellow square]

The yellow square indicates to the user that only the part of the photo inside it (in the camera preview) will be saved. It acts like a crop area.

When I save that captured image, it saves a zoomed photo (a heavily zoomed-in photo):

[screenshot: the saved, zoomed-in photo]

What I found is that when I take a photo, its size is {2448, 3264}.
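(For reference, this is roughly how the dimensions can be checked; a minimal snippet, assuming `image` is the UIImage returned by the picker:)

    // Minimal check of what the picker returns: size (points), scale, and raw pixels.
    NSLog(@"size in points: %@", NSStringFromCGSize(image.size));
    NSLog(@"scale: %.1f", image.scale);
    NSLog(@"CGImage pixels: %zu x %zu",
          CGImageGetWidth(image.CGImage), CGImageGetHeight(image.CGImage));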

I'm cropping the image like this:

- (UIImage *)imageByCroppingImage:(UIImage *)image toSize:(CGSize)size
{
    // Center a rect of the requested size inside the image.
    double x = (image.size.width - size.width) / 2.0;
    double y = (image.size.height - size.height) / 2.0;

    CGRect cropRect = CGRectMake(x, y, size.width, size.height);

    // Crop the underlying CGImage to that rect.
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRect);

    UIImage *cropped = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    return cropped;
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = info[UIImagePickerControllerOriginalImage];

    if (image) {
        // Crop the full-resolution photo to 300 x 300 and save it to the photo library.
        UIImage *newImage = [self imageByCroppingImage:image toSize:CGSizeMake(300.f, 300.f)];
        UIImageWriteToSavedPhotosAlbum(newImage, nil, nil, nil);
    }
}


Any ideas or suggestions?


Update:

From this question: "Trying to crop my UIImage to a 1:1 aspect ratio (square) but it keeps enlarging the image causing it to be blurry. Why?"

I followed @DrummerB's answer, like this:

    // Work in pixels: UIImage.size is in points, so multiply by the scale.
    CGFloat originalWidth = image.size.width * image.scale;
    CGFloat originalHeight = image.size.height * image.scale;

    // Crop the largest possible square, anchored at the top-left of the CGImage.
    float smallestDimension = fminf(originalWidth, originalHeight);
    CGRect square = CGRectMake(0, 0, smallestDimension, smallestDimension);

    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], square);
    UIImage *squareImage = [UIImage imageWithCGImage:imageRef scale:image.scale orientation:image.imageOrientation];
    UIImageWriteToSavedPhotosAlbum(squareImage, nil, nil, nil);
    CGImageRelease(imageRef);

This is what I captured:

[screenshot: the scene as framed inside the yellow square]

And it gave me the following result:

[screenshot: the resulting square image, with red circles marking content from outside the yellow square]

Now I'm getting a square photo, but note that the output still contains parts of the photo that were outside the yellow square. What I want is only the part of the photo that lies inside the yellow square. The captured image is still {w=2448, h=3264}. The red circles mark the outer parts of the image that should not be included in the output, because they are not inside the yellow square.

What's wrong with this?

1 answer
Deceive 欺骗
Answered 2019-05-29 19:51

It looks like your implementation is returning a crop that is 300 by 300 pixels, while the yellow square you have on screen is 300 by 300 points. Points are not the same as pixels. So if your photo is 3264 pixels tall, cropping a 300-pixel region out of it returns an image only about a tenth of the original size.
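For illustration (not part of the original answer), here is a rough sketch of how the 300-point on-screen square could be mapped to a pixel crop rect on the full-resolution capture. It assumes the capture is aspect-filled to the screen width in portrait and that the yellow square is centered; the overlay size (300 points) is passed in as `squareSideInPoints`:

// Rough sketch: crop the region of the photo that corresponds to a centered,
// N-point square drawn over an aspect-filled camera preview.
- (UIImage *)cropImage:(UIImage *)image toCenteredSquareOfPointSize:(CGFloat)squareSideInPoints
{
    // Re-render the photo upright first, so the CGImage coordinates match
    // image.size (camera photos usually carry an orientation flag).
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *upright = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Pixel dimensions of the capture, e.g. 2448 x 3264.
    CGFloat imageWidthPx  = upright.size.width  * upright.scale;
    CGFloat imageHeightPx = upright.size.height * upright.scale;

    // Capture pixels per screen point, assuming the preview is aspect-filled
    // to the screen width.
    CGFloat screenWidthInPoints = [UIScreen mainScreen].bounds.size.width;
    CGFloat pixelsPerPoint = imageWidthPx / screenWidthInPoints;

    // The yellow square, expressed in capture pixels and centered.
    CGFloat sidePx = squareSideInPoints * pixelsPerPoint;
    CGRect cropRect = CGRectMake((imageWidthPx  - sidePx) / 2.0,
                                 (imageHeightPx - sidePx) / 2.0,
                                 sidePx, sidePx);

    CGImageRef croppedRef = CGImageCreateWithImageInRect(upright.CGImage, cropRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:upright.scale
                                     orientation:UIImageOrientationUp];
    CGImageRelease(croppedRef);
    return cropped;
}

Whether this matches your overlay exactly depends on how the preview is laid out (toolbars, cameraViewTransform, device aspect ratio), so the scale factor may need adjusting, but the key point is the same as above: convert the on-screen points to capture pixels before cropping.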
