Memory Leak in CMSampleBufferGetImageBuffer

Posted 2019-02-19 15:05

I'm getting a UIImage from a CMSampleBufferRef video buffer every N video frames like:

- (void)imageFromVideoBuffer:(void(^)(UIImage* image))completion {
    CMSampleBufferRef sampleBuffer = _myLastSampleBuffer;
    if (sampleBuffer != nil) {
        CFRetain(sampleBuffer);
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
        _lastAppendedVideoBuffer.sampleBuffer = nil;
        if (_context == nil) {
            _context = [CIContext contextWithOptions:nil];
        }
        CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CGImageRef cgImage = [_context createCGImage:ciImage fromRect:
                              CGRectMake(0, 0, CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer))];
        __block UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);
        CFRelease(sampleBuffer);

        if(completion) completion(image);

        return;
    }
    if(completion) completion(nil);
}

Xcode and Instruments detect a memory leak, but I'm not able to get rid of it. I'm releasing the CGImageRef and CMSampleBufferRef as usual:

CGImageRelease(cgImage);
CFRelease(sampleBuffer);
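For reference, the retain/release pairing above follows Core Foundation's ownership convention: `Create`/`Copy` (and `CFRetain`) hand you a +1 reference you must balance with a release, while `Get` functions return borrowed references. A toy refcounted type in plain C sketches the rule (these names are illustrative, not real CF API):

```c
#include <stdlib.h>

/* Toy refcounted object illustrating the Core Foundation "Create Rule":
   MyCreate/MyRetain give the caller a reference it must balance with
   MyRelease. Names are hypothetical, not real CF functions. */
typedef struct { int refcount; } MyRef;

MyRef *MyCreate(void) {               /* +1: caller owns the result */
    MyRef *r = malloc(sizeof *r);
    r->refcount = 1;
    return r;
}

MyRef *MyRetain(MyRef *r) {           /* +1: caller takes shared ownership */
    r->refcount++;
    return r;
}

int MyRelease(MyRef *r) {             /* -1: frees at zero, returns remaining count */
    int n = --r->refcount;
    if (n == 0) free(r);
    return n;
}
```

With this convention, every `CFRetain(sampleBuffer)` and `createCGImage:` must be matched by exactly one `CFRelease`/`CGImageRelease`, which the question's code does appear to do.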

[UPDATE] This is the AVCapture output callback where I get the sampleBuffer.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (captureOutput == _videoOutput) {
        _lastVideoBuffer.sampleBuffer = sampleBuffer;
        id<CIImageRenderer> imageRenderer = _CIImageRenderer;

        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                CIImage *ciImage = nil;
                ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
                if(_context==nil) {
                    _context = [CIContext contextWithOptions:nil];
                }
                CGImageRef processedCGImage = [_context createCGImage:ciImage
                                                             fromRect:[ciImage extent]];
                //UIImage *image=[UIImage imageWithCGImage:processedCGImage];
                CGImageRelease(processedCGImage);
                NSLog(@"Captured image %@", ciImage);
            }
        });
    }
}

The call that leaks is createCGImage:fromRect::

CGImageRef processedCGImage = [_context createCGImage:ciImage
                                                             fromRect:[ciImage extent]];

even with an @autoreleasepool, a CGImageRelease on the CGImage reference, and the CIContext held as an instance property.

This seems to be the same issue addressed here: Can't save CIImage to file on iOS without memory leaks

[UPDATE] The leak seems to be due a bug. The issue is well described in Memory leak on CIContext createCGImage at iOS 9?

A sample project shows how to reproduce this leak: http://www.osamu.co.jp/DataArea/VideoCameraTest.zip

The last comments there state:

It looks like they fixed this in 9.1b3. If anyone needs a workaround that works on iOS 9.0.x, I was able to get it working with this:

in test code (Objective-C in this case; the Swift workaround follows below):

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)  
    {  
        if (error) return;  

        __block NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"ipdf_pic_%i.jpeg",(int)[NSDate date].timeIntervalSince1970]];  

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];  
        dispatch_async(dispatch_get_main_queue(), ^  
        {  

            @autoreleasepool  
            {  
                CIImage *enhancedImage = [CIImage imageWithData:imageData];  

                if (!enhancedImage) return;  

                static CIContext *ctx = nil; if (!ctx) ctx = [CIContext contextWithOptions:nil];  

                CGImageRef imageRef = [ctx createCGImage:enhancedImage fromRect:enhancedImage.extent format:kCIFormatBGRA8 colorSpace:nil];  

                UIImage *image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:UIImageOrientationRight];  

                [[NSFileManager defaultManager] createFileAtPath:filePath contents:UIImageJPEGRepresentation(image, 0.8) attributes:nil];  

                CGImageRelease(imageRef);  
            }  
        });  
    }]; 

and the workaround for iOS 9.0 should be:

extension CIContext {  
    func createCGImage_(image:CIImage, fromRect:CGRect) -> CGImage {  
        let width = Int(fromRect.width)  
        let height = Int(fromRect.height)  

        let rawData =  UnsafeMutablePointer<UInt8>.alloc(width * height * 4)  
        render(image, toBitmap: rawData, rowBytes: width * 4, bounds: fromRect, format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())  
        let dataProvider = CGDataProviderCreateWithData(nil, rawData, height * width * 4) {info, data, size in UnsafeMutablePointer<UInt8>(data).dealloc(size)}  
        return CGImageCreate(width, height, 8, 32, width * 4, CGColorSpaceCreateDeviceRGB(), CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue), dataProvider, nil, false, .RenderingIntentDefault)!  
    }  
}  
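The workaround renders into a caller-allocated RGBA8 buffer and hands ownership to the data provider's release callback. The size arithmetic it depends on (rowBytes = width * 4, total = height * rowBytes) can be sketched in plain C; these names are illustrative, not real CG/CF API:

```c
#include <stdlib.h>

/* Hypothetical sketch of the bitmap the Swift workaround allocates:
   RGBA8 = 4 bytes per pixel, so rowBytes = width * 4 and the total
   buffer size is height * rowBytes. */
unsigned char *alloc_rgba_bitmap(size_t width, size_t height,
                                 size_t *row_bytes, size_t *total) {
    *row_bytes = width * 4;        /* 4 bytes per RGBA8 pixel */
    *total = height * *row_bytes;
    return calloc(1, *total);      /* zero-initialized pixel data */
}

/* Mirrors the CGDataProvider release callback in the workaround: the
   provider, not the caller, frees the buffer when it is done with it. */
void free_bitmap(void *info, const void *data, size_t size) {
    (void)info; (void)size;
    free((void *)data);
}
```

Handing the buffer to a release callback like this is what lets the workaround avoid the leaked intermediate that createCGImage: produced on iOS 9.0.x.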

2 Answers

走好不送 · answered 2019-02-19 15:38

I can confirm that this memory leak still exists on iOS 9.2. (I've also posted on the Apple Developer Forum.)

I get the same memory leak on iOS 9.2. I've tried dropping EAGLContext in favor of MetalKit and MTLDevice, and I've tried different CIContext methods such as drawImage, createCGImage, and render, but nothing seems to work.

It is very clear that this is a bug as of iOS 9. Try it yourself by downloading the example app from Apple (see below), running the same project on a device with iOS 8.4 and then on a device with iOS 9.2, and watching the memory gauge in Xcode.

Download https://developer.apple.com/library/ios/samplecode/AVBasicVideoOutput/Introduction/Intro.html#//apple_ref/doc/uid/DTS40013109

Add this at APLEAGLView.h:20:

@property (strong, nonatomic) CIContext* ciContext;

Replace APLEAGLView.m:118 with this:

[EAGLContext setCurrentContext:_context];
 _ciContext = [CIContext contextWithEAGLContext:_context];

And finally replace APLEAGLView.m:341-343 with this:

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);  

    @autoreleasepool  
    {  
        CIImage* sourceImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];  
        CIFilter* filter = [CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, sourceImage, nil];  
        CIImage* filteredImage = filter.outputImage;  

        [_ciContext render:filteredImage toCVPixelBuffer:pixelBuffer];  
    }  

glBindRenderbuffer(GL_RENDERBUFFER, _colorBufferHandle);  
祖国的老花朵 · answered 2019-02-19 15:53

We were experiencing a similar issue in an app we created, where we process each frame for feature keypoints with OpenCV and send off a frame every couple of seconds. After running for a while, we would end up with quite a few memory-pressure messages.

We managed to rectify this by running our processing code in its own autorelease pool, like so (jpegDataFromSampleBufferAndCrop does something similar to what you are doing, with added cropping):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {

        if ([self.lastFrameSentAt timeIntervalSinceNow] < -kContinuousRateInSeconds) {

            NSData *imageData = [self jpegDataFromSampleBufferAndCrop:sampleBuffer];

            if (imageData) {
                [self processImageData:imageData];
            }

            self.lastFrameSentAt = [NSDate date];

            imageData = nil;
        }
    }
}
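The throttling check in that callback (only process a frame once kContinuousRateInSeconds has elapsed since the last one) can be sketched language-neutrally in plain C; the names and values here are illustrative, not from our app:

```c
#include <stdbool.h>

/* Hypothetical sketch of the frame-throttling logic above: process a
   frame only when at least `interval` seconds have passed since the
   last frame was sent, then record the send time. */
typedef struct { double last_sent; } Throttle;

bool throttle_should_send(Throttle *t, double now, double interval) {
    if (now - t->last_sent >= interval) {
        t->last_sent = now;   /* mirrors self.lastFrameSentAt = [NSDate date] */
        return true;
    }
    return false;
}
```

Combined with the @autoreleasepool, this keeps only one frame's worth of Core Image objects alive per interval instead of one per captured frame.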