How do I convert this OpenGL pointer math to Swift?

Posted 2019-03-29 08:45

I’m following this tutorial about OpenGL / GLKit for iOS but trying to implement it in Swift. It’s going fine until I get to this part:

- (void)render {

    // 1: point the effect's texture unit at the loaded texture
    self.effect.texture2d0.name = self.textureInfo.name;
    self.effect.texture2d0.enabled = YES;

    // 2: bind the effect's shaders and state
    [self.effect prepareToDraw];

    // 3: enable the position and texture-coordinate attributes
    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glEnableVertexAttribArray(GLKVertexAttribTexCoord0);

    // 4 ---------------- This is where things break down...
    // Take the quad's base address, then add each member's offset so the
    // attributes point at the interleaved fields.
    long offset = (long)&_quad;
    glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, GL_FALSE, sizeof(TexturedVertex), (void *)(offset + offsetof(TexturedVertex, geometryVertex)));
    glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, sizeof(TexturedVertex), (void *)(offset + offsetof(TexturedVertex, textureVertex)));

    // 5: draw the four vertices as a triangle strip
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}

@end

I have a property self.quad that’s a Swift struct like this:

struct TexturedVertex {
    let geometryVertex: CGPoint
    let textureVertex: CGPoint
}

struct TexturedQuad {
    let bottomLeft: TexturedVertex
    let bottomRight: TexturedVertex
    let topLeft: TexturedVertex
    let topRight: TexturedVertex
}

It’s a fairly straightforward structure, but I don’t know how to pass it to the last argument of glVertexAttribPointer. Any help would be stellar!

1 Answer

Answered 2019-03-29 09:37

In Swift you can use the generic struct UnsafePointer to do the pointer arithmetic and casting. Swift has no offsetof macro, but you can work around that cleanly by taking a pointer to the geometryVertex and textureVertex members directly.

The following code compiles in an iOS playground. I haven't tested it beyond that, but I think it'll work:

import OpenGLES
import GLKit

struct TexturedVertex {
    var geometryVertex = GLKVector2()   // two GLfloats — always 32-bit
    var textureVertex = GLKVector2()
}

struct TexturedQuad {
    var bl = TexturedVertex()
    var br = TexturedVertex()
    var tl = TexturedVertex()
    var tr = TexturedVertex()
}

var _quad = TexturedQuad()

// Point each attribute at the first vertex's member; the stride tells GL
// how far apart consecutive vertices sit in the interleaved struct.
withUnsafePointer(to: &_quad.bl.geometryVertex) { pointer in
    glVertexAttribPointer(GLuint(GLKVertexAttrib.position.rawValue),
        2, GLenum(GL_FLOAT), GLboolean(GL_FALSE),
        GLsizei(MemoryLayout<TexturedVertex>.stride), pointer)
}
withUnsafePointer(to: &_quad.bl.textureVertex) { pointer in
    glVertexAttribPointer(GLuint(GLKVertexAttrib.texCoord0.rawValue),
        2, GLenum(GL_FLOAT), GLboolean(GL_FALSE),
        GLsizei(MemoryLayout<TexturedVertex>.stride), pointer)
}
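If you're on a newer Swift (4.2 or later), MemoryLayout does expose key-path-based offsets, so you can recover something very close to the original offsetof arithmetic. Here's a minimal sketch of that approach, assuming you upload the quad into a vertex buffer object first — the quadBuffer name is hypothetical, not from the tutorial:

import OpenGLES
import GLKit

// With a VBO bound, the last parameter of glVertexAttribPointer is a
// byte offset into the buffer rather than a live pointer.
// `quadBuffer` is a hypothetical buffer created here for illustration.
var quadBuffer: GLuint = 0
glGenBuffers(1, &quadBuffer)
glBindBuffer(GLenum(GL_ARRAY_BUFFER), quadBuffer)
glBufferData(GLenum(GL_ARRAY_BUFFER),
             MemoryLayout<TexturedQuad>.stride, &_quad,
             GLenum(GL_STATIC_DRAW))

// MemoryLayout.offset(of:) is Swift's equivalent of C's offsetof.
let geometryOffset = MemoryLayout<TexturedVertex>.offset(of: \.geometryVertex)!
let textureOffset = MemoryLayout<TexturedVertex>.offset(of: \.textureVertex)!

glVertexAttribPointer(GLuint(GLKVertexAttrib.position.rawValue),
    2, GLenum(GL_FLOAT), GLboolean(GL_FALSE),
    GLsizei(MemoryLayout<TexturedVertex>.stride),
    UnsafeRawPointer(bitPattern: geometryOffset))
glVertexAttribPointer(GLuint(GLKVertexAttrib.texCoord0.rawValue),
    2, GLenum(GL_FLOAT), GLboolean(GL_FALSE),
    GLsizei(MemoryLayout<TexturedVertex>.stride),
    UnsafeRawPointer(bitPattern: textureOffset))

This also sidesteps a lifetime subtlety with the withUnsafePointer version: a pointer into a buffer object stays valid at draw time, whereas a client-memory pointer has to stay valid until glDrawArrays runs.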

By the way, using CGPoint the way you did in your question is dangerous, because CGFloat changes size depending on your target (32-bit or 64-bit), but GL_FLOAT always means 32-bit. The tutorial you're following was written before 64-bit iOS came out.
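You can see the mismatch for yourself in a playground. A quick size check (the commented values assume a 64-bit target):

import CoreGraphics
import OpenGLES
import GLKit

// On a 64-bit target CGFloat is a Double under the hood, so a CGPoint
// is 16 bytes, while GL_FLOAT always describes a 4-byte float.
print(MemoryLayout<CGFloat>.size)      // 8 on 64-bit, 4 on 32-bit
print(MemoryLayout<GLfloat>.size)      // always 4
print(MemoryLayout<CGPoint>.size)      // 16 on 64-bit
print(MemoryLayout<GLKVector2>.size)   // always 8 — safe with GL_FLOAT

That's why the answer above uses GLKVector2 instead of CGPoint for the vertex members: its components are GLfloats, so the struct layout matches what you're describing to OpenGL.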
