We are applying a 'CIGaussianBlur' filter on a few images. The process works fine most of the time, but when the app moves to the background the process produces white stripes on the image. (Images below; notice that the left and bottom of the image are striped to white, and that the image is shrunk a bit compared to the original.)
The Code:
- (UIImage *)imageWithBlurRadius:(CGFloat)radius
{
    UIImage *image = self;
    LOG(@"(1) image size before resize = %@", NSStringFromCGSize(image.size));
    NSData *imageData = UIImageJPEGRepresentation(self, 1.0);
    LOG(@"(2) image data length = %lu", (unsigned long)imageData.length);

    // Create our blurred image.
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];

    // Set up the Gaussian blur (we could use one of many filters offered by Core Image).
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:radius] forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];

    // CIGaussianBlur has a tendency to shrink the image a little;
    // this ensures it matches up exactly to the bounds of our original image.
    CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
    UIImage *finalImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    LOG(@"(3) final image size after resize = %@", NSStringFromCGSize(finalImage.size));
    return finalImage;
}
Before Filter:

After Filter:
Actually, I just faced this exact problem, and found a solution that's different than what @RizwanSattar describes.
What I do, based on an exchange with "Rincewind" on the Apple developer boards, is to first apply a CIAffineClamp on the image, with the transform value set to identity. This creates an image at the same scale, but with an infinite extent. That causes the blur to blur the edges correctly.
Then, after I apply the blur, I crop the image back to its original extent, cropping away the feathering that takes place on the edges.
You can see the code in a CI Filter demo app I've posted on github:
CIFilter demo project on github
It's a general-purpose program that handles all the different CI filters, but it has code to deal with the Gaussian blur filter.
Take a look at the method showImage. It has special-case code to set the extent on the source image before applying the blur filter (the method clampFilter just lazily loads a CIAffineClamp filter), and then applies the user-selected filter.
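The clamp-then-blur sequence described above can be sketched like this (a sketch based on the description, not the exact code from the demo app; sourceImage and radius are placeholder names):

```objectivec
// Clamp the source image with an identity transform: same scale,
// but infinite extent, with the edge pixels repeated outward. This
// gives the blur real pixels to sample beyond the original borders.
CIFilter *clamp = [CIFilter filterWithName:@"CIAffineClamp"];
[clamp setValue:sourceImage forKey:kCIInputImageKey];
[clamp setValue:[NSValue valueWithCGAffineTransform:CGAffineTransformIdentity]
         forKey:@"inputTransform"];

// Feed the clamped, infinite-extent image into the blur filter.
CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setValue:clamp.outputImage forKey:kCIInputImageKey];
[blur setValue:@(radius) forKey:@"inputRadius"];
CIImage *blurred = blur.outputImage;
```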
Then, after applying the selected filter, I check the extent of the resulting image and crop it back to the original extent if it's bigger.
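That check-and-crop step can be sketched as follows (variable names are placeholders; imageByCroppingToRect: requires iOS 8 or later):

```objectivec
// The filtered image may be larger than the source (the blur feathers
// outward), so crop it back to the original extent if it grew.
CIImage *outputImage = selectedFilter.outputImage;
if (!CGRectEqualToRect(outputImage.extent, sourceExtent)) {
    outputImage = [outputImage imageByCroppingToRect:sourceExtent];
}
```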
The reason you are seeing those "white stripes" in the blurred image is that the resulting CIImage is bigger than your original image, because it includes the fuzzy edges of the blur. When you hard-crop the resulting image to the same size as your original, you're not accounting for those fuzzy edges.
After:
CIImage *result = [filter valueForKey:kCIOutputImageKey];
Take a look at result.extent, which is a CGRect that shows you the new bounding box relative to the original image (i.e., for positive radii, result.extent.origin.y will be negative). Here's some code (you should really test it):
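The original snippet isn't reproduced here; a hypothetical sketch of the idea is shown below, cropping inward past the feathered border instead of hard-cropping at the original bounds:

```objectivec
CIImage *result = [filter valueForKey:kCIOutputImageKey];

// For positive radii the blur grows the image outward, so the result's
// extent origin goes negative relative to the original image.
CGRect resultExtent   = result.extent;
CGRect originalExtent = inputImage.extent;   // e.g. (0, 0, width, height)

// How far the blur expanded past the original image on each side.
CGFloat growX = originalExtent.origin.x - resultExtent.origin.x;
CGFloat growY = originalExtent.origin.y - resultExtent.origin.y;

// Crop inward past the feathered border, so no fuzzy (white-rendered)
// edge is left in the output. Note the result is slightly smaller than
// the original image.
CGRect cropRect = CGRectInset(originalExtent, growX, growY);
CGImageRef cgImage = [context createCGImage:result fromRect:cropRect];
UIImage *finalImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
```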
Hope that helps.