I use the following code to scale my UIImagePickerController:
// Move the preview down 71 points so it sits exactly in the middle of the
// screen, then scale it up to fill; the final transform combines both steps.
CGAffineTransform transform = CGAffineTransformMakeTranslation(0.0, 71.0);
transform = CGAffineTransformScale(transform, 1.333333, 1.333333);
self.imagePicker.cameraViewTransform = transform;
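For what it's worth, 1.333333 is roughly what you get when a 4:3 camera preview shown at screen width is stretched to fill a 320×568-point screen. A minimal sketch of deriving the factor from the screen bounds instead of hard-coding it (the 71-point offset is kept from above; the 4:3 preview aspect is an assumption):

CGSize screen = [UIScreen mainScreen].bounds.size;
CGFloat previewHeight = screen.width * 4.0 / 3.0;    // 4:3 preview shown at screen width
CGFloat scaleFactor = screen.height / previewHeight; // ~1.333 on a 320x568 screen
CGAffineTransform transform = CGAffineTransformMakeTranslation(0.0, 71.0);
transform = CGAffineTransformScale(transform, scaleFactor, scaleFactor);
self.imagePicker.cameraViewTransform = transform;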
Then I take the picture and present it in a UIImageView:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissViewControllerAnimated:NO completion:NULL];

    // Prefer the edited image and fall back to the original.
    // (Note: UIImagePickerControllerCropRect is an NSValue wrapping a CGRect,
    // not a UIImage, so it is not a valid second fallback here.)
    _imageTaken = [info objectForKey:UIImagePickerControllerEditedImage];
    if (_imageTaken == nil) {
        _imageTaken = [info objectForKey:UIImagePickerControllerOriginalImage];
    }

    [_imageTakenView setContentMode:UIViewContentModeScaleAspectFill];

    if (_selfie) {
        // Mirror front-camera shots by re-tagging the orientation metadata.
        UIImage *flippedImage = [UIImage imageWithCGImage:_imageTaken.CGImage
                                                    scale:_imageTaken.scale
                                              orientation:UIImageOrientationLeftMirrored];
        _imageTaken = flippedImage;
    }

    _imageTakenView.image = _imageTaken;
}
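One caveat about the selfie branch: imageWithCGImage:scale:orientation: only tags the image with new orientation metadata; the underlying pixels are untouched. UIImageView honors that flag, so the preview looks right, but anything downstream that drops the flag effectively swaps width and height, which reads as horizontal squishing. A sketch of a hypothetical normalizedImage: helper (not part of the original code) that redraws the image so the pixels themselves are upright:

// Hypothetical helper: bake the orientation into the pixels by redrawing,
// so the result always carries UIImageOrientationUp.
- (UIImage *)normalizedImage:(UIImage *)image {
    if (image.imageOrientation == UIImageOrientationUp) {
        return image; // already upright, nothing to do
    }
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}

Passing _imageTaken through a helper like this before JPEG encoding would rule orientation out as a cause of the distortion.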
Everything is good so far. Then I send the image up to my database by converting it to NSData:
// _imageTaken is the image captured in the delegate method above.
_imageData = UIImageJPEGRepresentation(_imageTaken, 0.1);
When I load the data back from the server and present it in another UIImageView of the same size, I set the same content mode:
[_imageViewToShow setContentMode:UIViewContentModeScaleAspectFill];
but the image looks distorted (squished horizontally). Any ideas as to why this might be?
Convert UIImage to NSData & Save
_imageData = UIImageJPEGRepresentation(_imageTaken, 0.1);
PFFile *imageFile = [PFFile fileWithName:@"image.jpg" data:_imageData]; // JPEG data, so a .jpg name
newMessage[@"image"] = imageFile;
Problem:
When I re-download the data, the UIImage appears squished. When I look at the image in my database, it seems fine. Not sure why this is happening...
Before and After Images: (attached screenshots omitted)