Is there a way to render the contents of a UIView as a texture with OpenGL in iOS? Or the contents of a CGLayer?
Here is another method of grabbing an OpenGL layer: use glReadPixels. This will grab the visible layers BEHIND your OpenGL layer as well (basically, your visible screen). Check out this question: How do I grab an image from my EAGLLayer?
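A minimal sketch of the glReadPixels grab, assuming the GL context is current and the frame has just been rendered (the width/height values here are just illustrative, not from the original answer):

    // Hypothetical sketch: read back the currently presented framebuffer.
    // 'width'/'height' should be the backing store size of your layer.
    GLint width = 320, height = 480;   // illustrative values
    GLubyte *pixels = (GLubyte *)malloc(width * height * 4);

    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    // 'pixels' now holds the framebuffer contents, bottom row first; flip it
    // vertically if you want a top-down image (e.g. to build a UIImage).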
Once you have your image, it must be resized to a power of two. You can try to stretch the image, but that will cause quality issues when you shrink it again, or if you do it over and over. The best way is to draw your image at its normal size into a power-of-two texture, with the extra pixels acting as padding. Here is the code (I modified this from someone else's code, but I can't find the original, so if someone recognizes it, please let me know where it came from so I can give credit):
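A rough sketch of that step, assuming the screenshot is already available as a UIImage (the helper name textureFromImage is just for illustration, not from the original code):

    #import <UIKit/UIKit.h>
    #import <OpenGLES/ES1/gl.h>

    // Hypothetical sketch: draw a UIImage at its normal size into the corner of a
    // power-of-two bitmap, then upload that bitmap as a texture.
    static GLuint textureFromImage(UIImage *image) {
        size_t width  = CGImageGetWidth(image.CGImage);
        size_t height = CGImageGetHeight(image.CGImage);

        // Round each dimension up to the next power of two.
        size_t texWidth = 1, texHeight = 1;
        while (texWidth  < width)  texWidth  <<= 1;
        while (texHeight < height) texHeight <<= 1;

        // Draw the image into the larger buffer; the remaining pixels are padding.
        GLubyte *data = (GLubyte *)calloc(texWidth * texHeight * 4, sizeof(GLubyte));
        CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(data, texWidth, texHeight, 8,
                                                     texWidth * 4, colourSpace,
                                                     kCGImageAlphaPremultipliedLast);
        CGColorSpaceRelease(colourSpace);
        CGContextDrawImage(context, CGRectMake(0, 0, width, height), image.CGImage);
        CGContextRelease(context);

        GLuint texture;
        glGenTextures(1, &texture);
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)texWidth, (GLsizei)texHeight,
                     0, GL_RGBA, GL_UNSIGNED_BYTE, data);
        free(data);

        return texture;
    }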
Now, once that is done, you use a texture coordinate array to pull ONLY the correctly sized section from your image, such as this:
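For example, a sketch with illustrative numbers (a 320 × 480 screenshot drawn into a 512 × 512 texture; not values from the original answer):

    // Hypothetical sketch: sample only the portion of the power-of-two texture
    // that actually contains the image. CoreGraphics/OpenGL vertical flips are
    // glossed over here.
    GLfloat maxS = 320.0f / 512.0f;   // used width  / texture width
    GLfloat maxT = 480.0f / 512.0f;   // used height / texture height

    GLfloat texCoords[] = {
        0.0f, 0.0f,
        maxS, 0.0f,
        0.0f, maxT,
        maxS, maxT,
    };

    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glTexCoordPointer(2, GL_FLOAT, 0, texCoords);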
And there you have it: a screenshot of your visible screen that is now a texture.
You can use the view's layer property to get the CALayer and call renderInContext: on that to draw into a Core Graphics context. You can set up that context with memory you allocate yourself in order to receive a pixel buffer, and then upload the buffer to OpenGL by the normal method.
So: there's a means to get the pixel contents of a UIView and OpenGL will accept pixel buffers. There's no specific link between the two.
Coding extemporaneously, the process would be something like:
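As a rough sketch of that process (assuming view is the UIView to capture, its size in pixels is acceptable as a texture size, and a texture has already been generated and bound):

    // Hypothetical sketch: render a UIView's layer into memory we own, then upload it.
    size_t width  = (size_t)view.bounds.size.width;
    size_t height = (size_t)view.bounds.size.height;

    void *imageData = malloc(width * height * 4);
    CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(imageData, width, height, 8,
                                                 width * 4, colourSpace,
                                                 kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colourSpace);

    // Core Graphics puts the origin at the bottom left; flip so the layer renders
    // the way UIKit draws it (you may want to omit this depending on how you map
    // your texture coordinates).
    CGContextTranslateCTM(context, 0, height);
    CGContextScaleCTM(context, 1.0, -1.0);
    [view.layer renderInContext:context];

    // Upload to the currently bound texture by the normal method.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, imageData);

    CGContextRelease(context);
    free(imageData);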
That doesn't deal with the issues around non-power-of-two sized views on hardware without the non-power-of-two texture extension, and it assumes a suitable GL texture name has already been generated and bound. Check for yourself, but I think non-power-of-two is supported on SGX hardware (i.e., the iPhone 3GS onwards, the iPad, and the third-generation iPod touch onwards except the 8GB model) but not on MBX.
The easiest way to deal with non-power-of-two textures here is probably to create a large enough power-of-two texture and use glTexSubImage2D to upload just the portion from your source UIView.
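A minimal sketch of that, with purely illustrative sizes (a 300 × 200 view into a 512 × 256 texture) and reusing the imageData buffer from the sketch above:

    // Hypothetical sketch: allocate power-of-two storage once, then upload only
    // the view-sized region with glTexSubImage2D.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 256, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);            // storage only, no pixels yet
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 300, 200,
                    GL_RGBA, GL_UNSIGNED_BYTE, imageData);    // just the view's pixels

Your texture coordinates then need to cover only the 300/512 × 200/256 portion of the texture, much like the coordinate-array approach shown earlier.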