I'm writing an image editor for an iPhone app. I use the GPUImage library to present the sprite's image on the canvas (in a GPUImageView), and I use the following method to take a screenshot of the canvas:
@implementation UIView (SnapshotImage)

// Return a snapshot image of this view
- (UIImage *)snapshot {
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, YES, 0.0f);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // In iOS 7, you can use the following instead of `-renderInContext:`. It should be faster.
    // [self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
    [self.layer renderInContext:context];
    UIImage *capturedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return capturedImage;
}

@end
When I use this method to take a screenshot of the whole key window, the canvas area comes out empty (no sprite image at all):
UIWindow *keyWindow = [[UIApplication sharedApplication] keyWindow];
UIImage *snapshotImage = [keyWindow snapshot];
// Present the image on the view...
I think the problem is taking a screenshot of a view hierarchy that contains GPUImageView(s), since they render with OpenGL ES rather than Quartz. How can I get a screenshot of a view that contains a GPUImageView?
And is it possible to crop the screenshot to a specific CGRect while taking it (like Cmd+Shift+4 on the Mac)?
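For reference, here is a rough sketch of the kind of cropping I have in mind, assuming the snapshot above works and is rendered at the screen scale (so the point-based CGRect has to be scaled to pixels before cropping); the rect values here are just placeholders:

UIImage *fullSnapshot = [keyWindow snapshot];

// Example crop rect in points (placeholder values).
CGRect cropRectInPoints = CGRectMake(20.0f, 100.0f, 280.0f, 280.0f);

// The snapshot was rendered at screen scale, so convert the rect to pixels.
CGFloat scale = fullSnapshot.scale;
CGRect cropRectInPixels = CGRectMake(cropRectInPoints.origin.x * scale,
                                     cropRectInPoints.origin.y * scale,
                                     cropRectInPoints.size.width * scale,
                                     cropRectInPoints.size.height * scale);

CGImageRef croppedCGImage = CGImageCreateWithImageInRect(fullSnapshot.CGImage, cropRectInPixels);
UIImage *croppedImage = [UIImage imageWithCGImage:croppedCGImage
                                            scale:scale
                                      orientation:fullSnapshot.imageOrientation];
CGImageRelease(croppedCGImage);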
As you found, -renderInContext: doesn't work with OpenGL ES content. What you'll need to do is extract the processed image from GPUImage itself. To do this, call -imageFromCurrentlyProcessedOutput on the filter that feeds directly into your GPUImageView. This will give you a UIImage containing the final filtered image. You can then either use this directly, or composite it with the remaining content from your surrounding views.