I am using the iPhone X and ARKit's face tracking to capture the user's face. The goal is to texture the face mesh with the user's image. I'm only looking at a single frame (an ARFrame) from the AR session.

From ARFaceGeometry, I have a set of vertices that describe the face. I make a JPEG representation of the current frame's capturedImage.
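(A minimal sketch of that conversion, going from the capturedImage pixel buffer to JPEG data via Core Image; the helper name is illustrative:)

    import ARKit
    import CoreImage
    import UIKit

    // Sketch: convert ARFrame.capturedImage (a CVPixelBuffer in the camera's
    // native format and orientation) into JPEG data.
    func jpegData(from frame: ARFrame, quality: CGFloat = 0.9) -> Data? {
        let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage).jpegData(compressionQuality: quality)
    }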
I then want to find the texture coordinates that map the created JPEG onto the mesh vertices. I want to:

1. map the vertices from model space to world space;
2. map the vertices from world space to camera space;
3. divide by the image dimensions to get pixel coordinates for the texture.
    let geometry: ARFaceGeometry = contentUpdater.faceGeometry!
    let theCamera = session.currentFrame!.camera
    let theFaceAnchor: SCNNode = contentUpdater.faceNode
    let anchorTransform = theFaceAnchor.simdTransform

    for index in 0..<geometry.vertices.count {
        let vertex = geometry.vertices[index]

        // Step 1: model space to world space, using the face node's transform
        let vertex4 = simd_float4(vertex.x, vertex.y, vertex.z, 1.0)
        let worldSpace = anchorTransform * vertex4

        // Step 2: world space to image space via the camera projection
        let world3 = simd_float3(worldSpace.x, worldSpace.y, worldSpace.z)
        let projectedPt = theCamera.projectPoint(world3,
                                                 orientation: .landscapeRight,
                                                 viewportSize: theCamera.imageResolution)

        // Step 3: divide by image width/height to get normalized texture coordinates
        let vtx = projectedPt.x / theCamera.imageResolution.width
        let vty = projectedPt.y / theCamera.imageResolution.height
        textureVs += "vt \(vtx) \(vty)\n"
    }
This is not working; instead it gets me a very funky-looking face! Where am I going wrong?
ARSCNFaceGeometry is SceneKit's representation of face topology for use with the face information provided by an ARSession. It's used for quick visualization of face geometry with SceneKit's rendering engine. The ARSCNFaceGeometry class is a subclass of SCNGeometry that wraps the mesh data provided by the ARFaceGeometry class. You can use ARSCNFaceGeometry to quickly and easily visualize the face topology and facial expressions that ARKit provides in a SceneKit view. But ARSCNFaceGeometry is available only in SceneKit views or renderers that use Metal; this class is not supported for OpenGL-based SceneKit rendering.

The start point is different.
Apply the following changes to your code:
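A minimal sketch of that approach, assuming an ARSCNView whose renderer uses Metal (the delegate class and names here are illustrative):

    import ARKit
    import SceneKit

    class FaceMeshDelegate: NSObject, ARSCNViewDelegate {

        // Provide an ARSCNFaceGeometry-backed node when a face anchor is detected.
        func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
            guard anchor is ARFaceAnchor,
                  let device = renderer.device,                        // Metal device required by ARSCNFaceGeometry
                  let faceGeometry = ARSCNFaceGeometry(device: device) else { return nil }
            let node = SCNNode(geometry: faceGeometry)
            node.geometry?.firstMaterial?.fillMode = .lines            // wireframe, just to check alignment
            return node
        }

        // Keep the SceneKit mesh in sync with the tracked face on every update.
        func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
            guard let faceAnchor = anchor as? ARFaceAnchor,
                  let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }
            faceGeometry.update(from: faceAnchor.geometry)
        }
    }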
You can then get a normal-looking face.
Texturing the face mesh with the user's image is now available in the face-based sample code published by Apple (see the section "Map Camera Video onto 3D Face Geometry"). One can map camera video onto the 3D face geometry using a geometry shader modifier along the lines of the sketch below.
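This sketch sets a Metal geometry shader modifier (embedded as a Swift string) on an ARSCNFaceGeometry; faceGeometry and displayTransform are assumptions here: displayTransform is a custom uniform you supply at runtime, derived from ARFrame.displayTransform(for:viewportSize:).

    // Assumed: faceGeometry is an ARSCNFaceGeometry instance.
    let videoTextureModifier = """
        // Custom uniform, set via key-value coding from a transform derived from
        // ARFrame.displayTransform(for:viewportSize:).
        #pragma arguments
        float4x4 displayTransform;

        #pragma body

        // Transform the vertex into the camera coordinate system.
        float4 vertexCamera = scn_node.modelViewTransform * _geometry.position;

        // Project into clip space and apply the perspective divide.
        float4 vertexClipSpace = scn_frame.projectionTransform * vertexCamera;
        vertexClipSpace /= vertexClipSpace.w;

        // Clip-space XY runs from -1 to 1; remap to 0...1 and flip Y for texture space.
        float4 vertexImageSpace = float4(vertexClipSpace.xy * 0.5 + 0.5, 0.0, 1.0);
        vertexImageSpace.y = 1.0 - vertexImageSpace.y;

        // Account for device orientation and the front camera's mirroring.
        float4 transformedVertex = displayTransform * vertexImageSpace;

        // Use the projected position as the texture coordinate for the video image.
        _geometry.texcoords[0] = transformedVertex.xy;
        """

    faceGeometry.shaderModifiers = [.geometry: videoTextureModifier]

For the video to actually appear, the geometry's material also needs the live camera image as its diffuse contents (Apple's sample reuses the scene's background contents for this), and the displayTransform uniform has to be set on the geometry or material via key-value coding.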