How to convert a byte array to an image in Objective-C

Posted 2020-06-26 08:03

Question:

This is the case: I have an unsigned char pointer to BMP image data. After looping over the pointer I end up with a byte array containing int values from 0 to 255.

What I want is to convert the values in this array into a BMP image so that I can display it as a UIImage.

Note: the image is in grayscale.

Answer 1:

This code snippet is from this blog; I recommend you take a look there and at the project on GitHub.

Also note that this class method is written for RGBA8 images, so for a grayscale buffer you will need to change bitsPerPixel (it should be 8 for grayscale), bytesPerRow (1 * width), bufferLength (drop the * 4), and create colorSpaceRef with CGColorSpaceCreateDeviceGray() instead.

Also, keeping the alpha flag in bitmapInfo together with a gray color space means Core Graphics expects a gray byte plus an alpha byte per pixel, so you may need to add an alpha byte for each pixel (or switch to kCGImageAlphaNone) to make it work properly. A sketch with those grayscale changes applied follows the method below.

+ (UIImage *) convertBitmapRGBA8ToUIImage:(unsigned char *) buffer 
            withWidth:(int) width
           withHeight:(int) height {


    size_t bufferLength = width * height * 4;
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, bufferLength, NULL);
    size_t bitsPerComponent = 8;
    size_t bitsPerPixel = 32;
    size_t bytesPerRow = 4 * width;

    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    if(colorSpaceRef == NULL) {
        NSLog(@"Error allocating color space");
        CGDataProviderRelease(provider);
        return nil;
    }

    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    CGImageRef iref = CGImageCreate(width, 
                height, 
                bitsPerComponent, 
                bitsPerPixel, 
                bytesPerRow, 
                colorSpaceRef, 
                bitmapInfo, 
                provider,   // data provider
                NULL,       // decode
                YES,            // should interpolate
                renderingIntent);

    uint32_t* pixels = (uint32_t*)malloc(bufferLength);

    if(pixels == NULL) {
        NSLog(@"Error: Memory not allocated for bitmap");
        CGDataProviderRelease(provider);
        CGColorSpaceRelease(colorSpaceRef);
        CGImageRelease(iref);       
        return nil;
    }

    CGContextRef context = CGBitmapContextCreate(pixels, 
                 width, 
                 height, 
                 bitsPerComponent, 
                 bytesPerRow, 
                 colorSpaceRef, 
                 bitmapInfo); 

    if(context == NULL) {
        NSLog(@"Error context not created");
        free(pixels);
        pixels = NULL; // avoid the double free at the end of the method
    }

    UIImage *image = nil;
    if(context) {

        CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, width, height), iref);

        CGImageRef imageRef = CGBitmapContextCreateImage(context);

        // Support both iPad 3.2 and iPhone 4 Retina displays with the correct scale
        if([UIImage respondsToSelector:@selector(imageWithCGImage:scale:orientation:)]) {
            float scale = [[UIScreen mainScreen] scale];
            image = [UIImage imageWithCGImage:imageRef scale:scale orientation:UIImageOrientationUp];
        } else {
            image = [UIImage imageWithCGImage:imageRef];
        }

        CGImageRelease(imageRef);   
        CGContextRelease(context);  
    }

    CGColorSpaceRelease(colorSpaceRef);
    CGImageRelease(iref);
    CGDataProviderRelease(provider);

    if(pixels) {
        free(pixels);
    }   
    return image;
}
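
For reference, here is a minimal, untested sketch of the grayscale variant described above. It assumes an 8-bit-per-pixel buffer with no alpha; the method name convertBitmapGray8ToUIImage is just an illustration and is not part of the original project.

+ (UIImage *) convertBitmapGray8ToUIImage:(unsigned char *) buffer
                                withWidth:(int) width
                               withHeight:(int) height {

    size_t bufferLength = width * height;   // 1 byte per pixel, no * 4
    size_t bitsPerComponent = 8;
    size_t bitsPerPixel = 8;
    size_t bytesPerRow = 1 * width;

    // NOTE: the provider does not copy the buffer (NULL release callback),
    // so the caller must keep the buffer alive as long as the image is used.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, bufferLength, NULL);
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceGray();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | kCGImageAlphaNone;

    CGImageRef iref = CGImageCreate(width, height,
                                    bitsPerComponent, bitsPerPixel, bytesPerRow,
                                    colorSpaceRef, bitmapInfo,
                                    provider, NULL, YES, kCGRenderingIntentDefault);

    UIImage *image = [UIImage imageWithCGImage:iref
                                         scale:[[UIScreen mainScreen] scale]
                                   orientation:UIImageOrientationUp];

    CGImageRelease(iref);
    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);
    return image;
}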


Answer 2:

I would imagine something like this. Untested, so be ready to tweak :)

UIImage *yourImage = [UIImage imageWithData:[NSData dataWithBytes:yourCharPointer length:lengthOfYourData]];

Two caveats: sizeof(yourCharPointer) would only give the size of the pointer itself (4 or 8 bytes), so lengthOfYourData has to be the actual byte count of the buffer; and imageWithData: expects a complete, encoded BMP file (headers included), not raw pixel values.
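
As a hedged illustration of the length issue, and only if yourCharPointer really points at a complete BMP file, the byte count can be read from the BMP file header (the file size is stored little-endian at offset 2) instead of being guessed:

// Sketch only: read the declared file size from the BITMAPFILEHEADER.
// Assumes yourCharPointer starts with the 'BM' magic bytes of a BMP file.
uint32_t bmpFileSize = 0;
memcpy(&bmpFileSize, yourCharPointer + 2, sizeof(bmpFileSize)); // offset 2: file size, little-endian
NSData *bmpData = [NSData dataWithBytes:yourCharPointer length:bmpFileSize];
UIImage *yourImage = [UIImage imageWithData:bmpData];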


Answer 3:

Because the accepted answer is too long, I've written another solution:

static void releasePixels(void *info, const void *data, size_t size) {
    free((void *)data); // free the pixel buffer once Core Graphics no longer needs it
}

- (UIImage*) imageFromArray:(const char*)pixelArray width:(int)width height:(int)height {

    int imageSizeInPixels = width * height;
    int bytesPerPixel = 2; // 1 byte for brightness, 1 byte for alpha
    unsigned char *pixels = (unsigned char *)malloc(imageSizeInPixels * bytesPerPixel);
    memset(pixels, 255, imageSizeInPixels * bytesPerPixel); // every alpha byte starts at 255 (opaque)
    for (int i = 0; i < imageSizeInPixels; i++) {
        pixels[i * 2] = pixelArray[i]; // write each array value as a pixel brightness
    }

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
    // The provider takes ownership of the pixel buffer and frees it via releasePixels.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, pixels, imageSizeInPixels * bytesPerPixel, releasePixels);
    CGImageRef cgImage = CGImageCreate(width, height, 8, 8 * bytesPerPixel, width * bytesPerPixel, colorSpace, (CGBitmapInfo)kCGImageAlphaPremultipliedLast, provider, NULL, false, kCGRenderingIntentDefault);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    // Release our Core Graphics references; the UIImage keeps what it needs.
    CGImageRelease(cgImage);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);

    return image;
}
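
A hypothetical usage example (the gradient buffer and variable names are made up for illustration):

int width = 64, height = 64;
char *gradient = (char *)malloc(width * height);
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        gradient[y * width + x] = (char)((x * 255) / (width - 1)); // left-to-right brightness ramp
    }
}
UIImage *grayImage = [self imageFromArray:gradient width:width height:height];
free(gradient); // safe: imageFromArray copies the bytes into its own buffer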