I’m following this tutorial about OpenGL / GLKit for iOS but trying to implement it in Swift. It’s going fine until I get to this part:
- (void)render {
    // 1
    self.effect.texture2d0.name = self.textureInfo.name;
    self.effect.texture2d0.enabled = YES;
    // 2
    [self.effect prepareToDraw];
    // 3
    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
    //---------------- This is where things break down...
    long offset = (long)&_quad;
    glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, GL_FALSE, sizeof(TexturedVertex), (void *)(offset + offsetof(TexturedVertex, geometryVertex)));
    glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, sizeof(TexturedVertex), (void *)(offset + offsetof(TexturedVertex, textureVertex)));
    // 5
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}
@end
I have a property self.quad that's a Swift struct like this:
struct TexturedVertex {
    let geometryVertex: CGPoint
    let textureVertex: CGPoint
}

struct TexturedQuad {
    let bottomLeft: TexturedVertex
    let bottomRight: TexturedVertex
    let topLeft: TexturedVertex
    let topRight: TexturedVertex
}
It’s a fairly straightforward structure, but I don’t know how to pass it to the last argument of glVertexAttribPointer. Any help would be stellar!
In Swift, you can use the generic struct UnsafePointer to perform pointer arithmetic and casting. Swift doesn't have offsetof, but you can work around that cleanly by taking UnsafePointers of the geometryVertex and textureVertex elements directly. The following code compiles in an iOS playground. I haven't tested it beyond that, but I think it'll work:
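Here is a rough sketch of that approach in current Swift. It assumes the question's effect, textureInfo, and quad properties (the SpriteRenderer class name is just for the sketch), leans on MemoryLayout.offset(of:), an offsetof equivalent added in later Swift versions, rather than the field-pointer trick, and passes GL_FLOAT, which is only correct if the vertex components are 32-bit floats; see the caveat below.

import GLKit

// A rough sketch, not tested against the tutorial. Property names mirror the
// question; setting up `effect`, `textureInfo`, and `quad` is assumed to
// happen elsewhere (e.g. in viewDidLoad).
class SpriteRenderer {
    var effect = GLKBaseEffect()
    var textureInfo: GLKTextureInfo?
    var quad: TexturedQuad?

    func render() {
        guard let textureInfo = textureInfo, let quad = quad else { return }

        // 1: hand the sprite's texture to the effect
        effect.texture2d0.name = textureInfo.name
        effect.texture2d0.enabled = GLboolean(GL_TRUE)

        // 2
        effect.prepareToDraw()

        // 3
        glEnableVertexAttribArray(GLuint(GLKVertexAttrib.position.rawValue))
        glEnableVertexAttribArray(GLuint(GLKVertexAttrib.texCoord0.rawValue))

        // 4: point both attributes into the quad. MemoryLayout.offset(of:) is a
        // newer-Swift stand-in for offsetof. GL_FLOAT mirrors the question and
        // is only right if the vertex components are 32-bit floats (see the
        // caveat about CGPoint/CGFloat below).
        let stride = GLsizei(MemoryLayout<TexturedVertex>.stride)
        let geometryOffset = MemoryLayout<TexturedVertex>.offset(of: \.geometryVertex)!
        let textureOffset = MemoryLayout<TexturedVertex>.offset(of: \.textureVertex)!

        withUnsafeBytes(of: quad) { (rawBuffer) -> Void in
            let base = rawBuffer.baseAddress!
            glVertexAttribPointer(GLuint(GLKVertexAttrib.position.rawValue), 2,
                                  GLenum(GL_FLOAT), GLboolean(GL_FALSE), stride,
                                  base + geometryOffset)
            glVertexAttribPointer(GLuint(GLKVertexAttrib.texCoord0.rawValue), 2,
                                  GLenum(GL_FLOAT), GLboolean(GL_FALSE), stride,
                                  base + textureOffset)

            // 5: draw while the pointer is still valid, i.e. inside the closure
            glDrawArrays(GLenum(GL_TRIANGLE_STRIP), 0, 4)
        }
    }
}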
By the way, using CGPoint the way you did in your question is dangerous, because CGFloat changes size depending on your target (32-bit or 64-bit), but GL_FLOAT always means 32-bit. The tutorial you're following was written before 64-bit iOS came out.
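For example, one way around that (a sketch of my own, not from the tutorial) is to build the vertex types from GLKVector2, whose components are always 32-bit Floats, so GL_FLOAT describes them correctly on every target:

import GLKit

// A sketch: vertex types built from GLKVector2 instead of CGPoint, so the
// component type is a 32-bit Float on both 32-bit and 64-bit targets.
struct TexturedVertex {
    var geometryVertex: GLKVector2
    var textureVertex: GLKVector2
}

struct TexturedQuad {
    var bottomLeft: TexturedVertex
    var bottomRight: TexturedVertex
    var topLeft: TexturedVertex
    var topRight: TexturedVertex
}

// Example: the bottom-left corner at the origin, mapped to texture coordinate (0, 0).
let bottomLeft = TexturedVertex(geometryVertex: GLKVector2Make(0, 0),
                                textureVertex: GLKVector2Make(0, 0))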