iPhone: How to get the colour of each pixel of an image

Posted 2019-06-10 06:45

I want to get the color of every individual pixel of an image. To elaborate: say I have an image named "SampleImage" that is 400 x 400 pixels. I basically want to create a grid from 'SampleImage' with 400 x 400 squares, each filled with the color of the corresponding pixel in 'SampleImage'.

I know this is a little abstract, but I am a novice in iOS development and don't know where to start. Thanks in advance!

3 Answers
看我几分像从前 · answered 2019-06-10 07:03

If you are a novice, consider starting with something simpler. That said, what you need to do is set up a CGContextRef via CGBitmapContextCreate, backed by a buffer large enough to hold your image. Once you have created it, render your image into it via CGContextDrawImage; after that you will have a pointer to every pixel in your image. The code is similar to Nishant's answer, but instead of a 1 x 1 context you use a 400 x 400 one to get all of the pixels at once.
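
A rough sketch of that approach, assuming a UIImage variable named sampleImage (a placeholder name) and omitting error handling:

// Sketch: render a UIImage into a raw RGBA buffer and read individual pixels.
// "sampleImage" stands in for your 400 x 400 UIImage.
CGImageRef imageRef = sampleImage.CGImage;
size_t width  = CGImageGetWidth(imageRef);
size_t height = CGImageGetHeight(imageRef);

unsigned char *pixels = calloc(width * height * 4, sizeof(unsigned char));
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pixels, width, height, 8, width * 4,
                                             colorSpace,
                                             kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);

// After this call, "pixels" holds the image as RGBA8888 data.
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
CGContextRelease(context);

// Read the pixel at (x, y); the coordinates here are just an example.
size_t x = 10, y = 20;
size_t offset = (y * width + x) * 4;
NSLog(@"pixel (%zu, %zu): r=%d g=%d b=%d a=%d", x, y,
      pixels[offset], pixels[offset + 1], pixels[offset + 2], pixels[offset + 3]);

free(pixels);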

Summer. ? 凉城 · answered 2019-06-10 07:06

Here is a more efficient solution: a UIView category that returns the colour of the pixel at a given point.

// UIView+ColorOfPoint.h
@interface UIView (ColorOfPoint)
- (UIColor *) colorOfPoint:(CGPoint)point;
@end

// UIView+ColorOfPoint.m
#import "UIView+ColorOfPoint.h"
#import <QuartzCore/QuartzCore.h>

@implementation UIView (ColorOfPoint)

- (UIColor *) colorOfPoint:(CGPoint)point
{
    unsigned char pixel[4] = {0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // 1 x 1 RGBA bitmap context backed by the 4-byte pixel buffer (premultiplied alpha).
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace, kCGImageAlphaPremultipliedLast);

    // Shift the context so that the requested point lands at its origin,
    // then render the view's layer; only that single pixel ends up in the buffer.
    CGContextTranslateCTM(context, -point.x, -point.y);
    [self.layer renderInContext:context];

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    //NSLog(@"pixel: %d %d %d %d", pixel[0], pixel[1], pixel[2], pixel[3]);

    UIColor *color = [UIColor colorWithRed:pixel[0]/255.0 green:pixel[1]/255.0 blue:pixel[2]/255.0 alpha:pixel[3]/255.0];

    return color;
}

@end
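
A quick hypothetical usage example, assuming a UIImageView named imageView that is already displaying the image on screen:

// "imageView" is an assumed UIImageView (or any UIView) showing the image.
UIColor *color = [imageView colorOfPoint:CGPointMake(100, 100)];
NSLog(@"colour at (100, 100): %@", color);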

Hope it helps you.

劳资没心，怎么记你 · answered 2019-06-10 07:24

This code worked flawlessly for me:

- (NSArray*)getRGBAsFromImage:(UIImage*)image atX:(int)xx andY:(int)yy count:(int)count{
    NSMutableArray *result = [NSMutableArray arrayWithCapacity:count];

    // First get the image into your data buffer
    CGImageRef imageRef = [image CGImage];
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *rawData = (unsigned char*) calloc(height * width * 4, sizeof(unsigned char));
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);

    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);

    // Now your rawData contains the image data in the RGBA8888 pixel format.
    int byteIndex = (bytesPerRow * yy) + xx * bytesPerPixel;
    for (int ii = 0 ; ii < count ; ++ii)
    {
        CGFloat red   = (rawData[byteIndex]     * 1.0) / 255.0;
        CGFloat green = (rawData[byteIndex + 1] * 1.0) / 255.0;
        CGFloat blue  = (rawData[byteIndex + 2] * 1.0) / 255.0;
        CGFloat alpha = (rawData[byteIndex + 3] * 1.0) / 255.0;
        byteIndex += 4;

        UIColor *acolor = [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
        [result addObject:acolor];
    }

    free(rawData);
    return result;
}
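
For the 400 x 400 grid in the question, here is a rough sketch of how this method could be driven, assuming the image really is 400 x 400 and that getRGBAsFromImage:atX:andY:count: lives on the same class ("SampleImage" is the asset name from the question):

// Collect the colour of every pixel, one row at a time.
UIImage *sampleImage = [UIImage imageNamed:@"SampleImage"];
NSMutableArray *grid = [NSMutableArray arrayWithCapacity:400];
for (int y = 0; y < 400; y++) {
    // 400 colours starting at column 0 of row y.
    NSArray *row = [self getRGBAsFromImage:sampleImage atX:0 andY:y count:400];
    [grid addObject:row];
}
// grid[y][x] is now the UIColor of pixel (x, y) in SampleImage.

Calling it once per row keeps the sketch simple but rebuilds the bitmap context 400 times; since byteIndex just keeps walking the buffer, a single call with atX:0 andY:0 count:400*400 would return all 160,000 colours in one pass.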