Decode images in background thread?

Posted 2020-02-08 08:53

Question:

I have a background thread that loads images and displays them on the main thread. I noticed that the background thread has almost nothing to do, because the actual image decoding seems to be done on the main thread.

So far I’ve tried calling [UIImage imageNamed:], [UIImage imageWithData:] and CGImageCreateWithJPEGDataProvider in the background thread with no difference. Is there a way to force the decoding to be done on the background thread?
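For reference, the loading path looks roughly like this (the queue, imageURL and imageView names are illustrative, not the actual project code):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // imageURL / imageView are placeholder names for this sketch
    NSData *data = [NSData dataWithContentsOfURL:imageURL];
    UIImage *image = [UIImage imageWithData:data];   // decoding is deferred here
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = image;                     // decode actually happens around display time
    });
});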

There’s already a similar question here, but it does not help. As I wrote there, I tried the following trick:

@implementation UIImage (Loading)

- (void) forceLoad
{
    const CGImageRef cgImage = [self CGImage];

    const size_t width = CGImageGetWidth(cgImage);
    const size_t height = CGImageGetHeight(cgImage);

    const CGColorSpaceRef colorspace = CGImageGetColorSpace(cgImage);
    const CGContextRef context = CGBitmapContextCreate(
        NULL, /* Where to store the data. NULL = don’t care */
        width, height, /* width & height */
        8, width * 4, /* bits per component, bytes per row */
        colorspace, kCGImageAlphaNoneSkipFirst);

    NSParameterAssert(context);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(context);
}

@end

That works (forces the image to decode), but it also triggers an apparently expensive call to ImageIO_BGR_A_TO_RGB_A_8Bit.
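For context, the category above is used along these lines (queue, imageData and imageView names are illustrative):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // imageData / imageView are placeholder names for this sketch
    UIImage *image = [UIImage imageWithData:imageData];
    [image forceLoad];   // decode the JPEG off the main thread
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = image;
    });
});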

Answer 1:

I ran into similar issues with hi-res images on the new retina iPad. Images larger than the screen size (roughly) would cause major problems with UI responsiveness. These were JPGs, so getting them to decode on the background seemed to be the right thing to do. I'm still working on tightening all of this up, but Tommy's solution worked great for me. I just wanted to contribute some code to help the next person along when they're trying to identify why their UI is stuttering with large images. Here's what I ended up doing (this code runs in an NSOperation on a background queue). The example is a blend of my code and the code above:

  CGDataProviderRef dataProvider = CGDataProviderCreateWithCFData((CFDataRef)self.data);
  CGImageRef newImage = CGImageCreateWithJPEGDataProvider(dataProvider,
                                    NULL, NO, 
                                    kCGRenderingIntentDefault);


  //////////
  // force DECODE

  const size_t width = CGImageGetWidth(newImage);
  const size_t height = CGImageGetHeight(newImage);

  const CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
  const CGContextRef context = CGBitmapContextCreate(
                                                     NULL, /* Where to store the data. NULL = don’t care */
                                                     width, height, /* width & height */
                                                     8, width * 4, /* bits per component, bytes per row */
                                                     colorspace, kCGImageAlphaNoneSkipFirst);

  NSParameterAssert(context);
  CGContextDrawImage(context, CGRectMake(0, 0, width, height), newImage);
  CGImageRef drawnImage = CGBitmapContextCreateImage(context);
  CGContextRelease(context);
  CGColorSpaceRelease(colorspace);

  //////////

  self.downloadedImage = [UIImage imageWithCGImage:drawnImage];

  CGDataProviderRelease(dataProvider);
  CGImageRelease(newImage);
  CGImageRelease(drawnImage);

I'm still optimizing this, but it seems to do pretty well so far.
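Once the operation finishes, the pre-decoded result still has to be handed to UIKit on the main thread; the imageView property below is illustrative, not part of the operation itself:

[[NSOperationQueue mainQueue] addOperationWithBlock:^{
    // self.downloadedImage already wraps the decoded bitmap, so assigning it
    // here no longer stalls the main thread with JPEG decoding.
    self.imageView.image = self.downloadedImage;   // imageView is illustrative
}];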



Answer 2:

Officially speaking, UIKit is not thread safe (although it has become a little safer in iOS 4, I believe), so you shouldn't be calling either of the UIImage methods on a background thread. In practice I think you tend not to hit problems if you don't do anything that causes a UIKit object to want to redraw, but that's rule of thumb stuff that you should never rely on in shipping code — I mention it just to explain why you probably haven't had a problem.

As you've spotted, CGImageRefs (including when wrapped in a UIImage) continue to refer to source images in their encoded form unless and until someone needs them decoded. The 'trick' you cite does effectively force a decode, but a better solution is to avoid UIImage completely until you're safely back on the main thread: start with CGImageCreateWithJPEGDataProvider, create a separate bitmap context as you do above, CGContextDrawImage the JPEG-backed image into the new context, then pass on an image made from that context (via CGBitmapContextCreateImage) and release the JPEG data provider.
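A minimal sketch of that flow, assuming the JPEG bytes arrive as an NSData called jpegData and that imageView is the eventual destination (both names are illustrative):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Core Graphics only on the background queue; no UIImage/UIKit here.
    // jpegData / imageView are placeholder names for this sketch.
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((CFDataRef)jpegData);
    CGImageRef encoded = CGImageCreateWithJPEGDataProvider(provider, NULL, false,
                                                           kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);

    const size_t width = CGImageGetWidth(encoded);
    const size_t height = CGImageGetHeight(encoded);
    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, width * 4,
                                                 rgb, kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), encoded);
    CGImageRef decoded = CGBitmapContextCreateImage(context);  // fully decoded bitmap
    CGContextRelease(context);
    CGColorSpaceRelease(rgb);
    CGImageRelease(encoded);

    dispatch_async(dispatch_get_main_queue(), ^{
        // UIImage is created only here, back on the main thread.
        imageView.image = [UIImage imageWithCGImage:decoded];
        CGImageRelease(decoded);
    });
});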

With respect to ImageIO_BGR_A_TO_RGB_A_8Bit, it may be worth giving CGColorSpaceCreateDeviceRGB() a go in place of CGImageGetColorSpace(cgImage), to get a context with a colour space that doesn't depend on the source image file underneath.
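Applied to the forceLoad category from the question, that suggestion looks something like this (a sketch; the method name is made up, and whether it avoids the BGR-to-RGB conversion will depend on the source file):

@implementation UIImage (LoadingDeviceRGB)

// Hypothetical variant of -forceLoad that decodes into a device RGB context
// rather than the image's own colour space. Unlike the colour space borrowed
// from CGImageGetColorSpace(), this one is owned here and must be released.
- (void)forceLoadIntoDeviceRGB
{
    const CGImageRef cgImage = [self CGImage];

    const size_t width = CGImageGetWidth(cgImage);
    const size_t height = CGImageGetHeight(cgImage);

    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    const CGContextRef context = CGBitmapContextCreate(
        NULL,          /* backing store: NULL = let CG allocate it */
        width, height,
        8, width * 4,  /* bits per component, bytes per row */
        colorspace, kCGImageAlphaNoneSkipFirst);
    CGColorSpaceRelease(colorspace);

    NSParameterAssert(context);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(context);
}

@end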