I've noticed that sometimes NSImage's size is not the real size (with some pictures), while CIImage's size is always the real size. I was testing with this image.
This is the source code I wrote for testing:
NSImage *_imageNSImage = [[NSImage alloc]initWithContentsOfFile:@"<path to image>"];
NSSize _dimensions = [_imageNSImage size];
[_imageNSImage release];
NSLog(@"Width from CIImage: %f",_dimensions.width);
NSLog(@"Height from CIImage: %f",_dimensions.height);
NSURL *_myURL = [NSURL fileURLWithPath:@"<path to image>"];
CIImage *_imageCIImage = [CIImage imageWithContentsOfURL:_myURL];
NSRect _rectFromCIImage = [_imageCIImage extent];
NSLog(@"Width from CIImage: %f",_rectFromCIImage.size.width);
NSLog(@"Height from CIImage: %f",_rectFromCIImage.size.height);
And the output is:
So how can that be? Am I doing something wrong?
NSImage's size method returns size information that is screen resolution dependent. To get the size represented in the actual image file you need to use an NSImageRep. You can get an NSImageRep from an NSImage using the representations method. Alternatively, you can create NSBitmapImageRep instances (a subclass of NSImageRep) directly like this:
NSArray * imageReps = [NSBitmapImageRep imageRepsWithContentsOfFile:@"<path to image>"];
NSInteger width = 0;
NSInteger height = 0;
// Keep the largest pixel dimensions found across all representations.
for (NSImageRep * imageRep in imageReps) {
    if ([imageRep pixelsWide] > width) width = [imageRep pixelsWide];
    if ([imageRep pixelsHigh] > height) height = [imageRep pixelsHigh];
}
NSLog(@"Width from NSBitmapImageRep: %ld", (long)width);
NSLog(@"Height from NSBitmapImageRep: %ld", (long)height);
The loop takes into account that some image formats may contain more than a single image (TIFF, for example).
You can create an NSImage at this size by using the following:
NSImage * imageNSImage = [[NSImage alloc] initWithSize:NSMakeSize((CGFloat)width, (CGFloat)height)];
[imageNSImage addRepresentations:imageReps];
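If you're working in Swift, the representations route described above reads roughly like this (a sketch; the helper name pixelSize(ofImageAtPath:) is mine, and it simply keeps the largest representation's pixel dimensions):
import AppKit

// Sketch: pixel dimensions via NSImage.representations.
// pixelSize(ofImageAtPath:) is a hypothetical helper name, not an AppKit API.
func pixelSize(ofImageAtPath path: String) -> NSSize? {
    guard let image = NSImage(contentsOfFile: path) else { return nil }
    var width = 0, height = 0
    for rep in image.representations {
        width = max(width, rep.pixelsWide)
        height = max(height, rep.pixelsHigh)
    }
    return NSSize(width: width, height: height)
}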
NSImage's size method returns the size in points. To get the size in pixels you need to inspect the NSImage.representations property, which contains an array of NSImageRep objects with pixelsWide/pixelsHigh properties, and then simply change the size of the NSImage object:
@implementation ViewController {
    __weak IBOutlet NSImageView *imageView;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do view setup here.
    NSImage *image = [[NSImage alloc] initWithContentsOfFile:@"/Users/username/test.jpg"];
    if (image.representations && image.representations.count > 0) {
        long lastSquare = 0, curSquare;
        NSImageRep *imageRep;
        // Find the representation with the largest pixel area and
        // resize the NSImage to those pixel dimensions.
        for (imageRep in image.representations) {
            curSquare = imageRep.pixelsWide * imageRep.pixelsHigh;
            if (curSquare > lastSquare) {
                image.size = NSMakeSize(imageRep.pixelsWide, imageRep.pixelsHigh);
                lastSquare = curSquare;
            }
        }
        imageView.image = image;
        NSLog(@"%.0fx%.0f", image.size.width, image.size.height);
    }
}

@end
Thanks to Zenopolis for the original Objective-C code; here's a nice concise Swift version:
func sizeForImageAtURL(url: NSURL) -> CGSize? {
    guard let imageReps = NSBitmapImageRep.imageRepsWithContentsOfURL(url) else { return nil }
    return imageReps.reduce(CGSize.zero, combine: { (size: CGSize, rep: NSImageRep) -> CGSize in
        return CGSize(width: max(size.width, CGFloat(rep.pixelsWide)), height: max(size.height, CGFloat(rep.pixelsHigh)))
    })
}
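The above is Swift 2 syntax; on a current toolchain the same idea reads roughly like this (a sketch, assuming the class method is bridged as imageReps(withContentsOf:)):
import AppKit

// Sketch of the same reduce-based approach in current Swift syntax.
func sizeForImage(at url: URL) -> CGSize? {
    guard let imageReps = NSBitmapImageRep.imageReps(withContentsOf: url) else { return nil }
    return imageReps.reduce(CGSize.zero) { size, rep in
        CGSize(width: max(size.width, CGFloat(rep.pixelsWide)),
               height: max(size.height, CGFloat(rep.pixelsHigh)))
    }
}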
If your file contains only one image, you can just use this:
let rep = image.representations[0]
let imageSize = NSSize(width: rep.pixelsWide, height: rep.pixelsHigh)
image is your NSImage; imageSize is the image size in pixels.
Copied and updated here: https://stackoverflow.com/a/13228091/3608824
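A minimal self-contained sketch of that single-representation case (the file path is a placeholder; real code should also check that a representation actually exists):
import AppKit

// Minimal sketch: pixel size of a file that contains a single image.
// "/path/to/image.jpg" is a placeholder, not a path from the original answer.
if let image = NSImage(contentsOfFile: "/path/to/image.jpg"),
   let rep = image.representations.first {
    let imageSize = NSSize(width: rep.pixelsWide, height: rep.pixelsHigh)
    print("\(Int(imageSize.width)) x \(Int(imageSize.height))")
}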