I'm using an ARSCNView from ARKit to display a live video feed from the camera on the iPad. I have the ARSCNView object set up exactly as in Xcode's Augmented Reality App template. Is there a way to get the field of view of the camera?
@IBOutlet var sceneView: ARSCNView!

func start() {
    sceneView.delegate = self
    sceneView.session.run(ARWorldTrackingConfiguration())
    // Retrieve camera FOV here
}
There are a couple of ways to go here, and a possible false start to beware of.
⚠️ ARKit + SceneKit (incorrect)
If you're already working with ARKit via SceneKit (ARSCNView), you might assume that ARKit is automatically updating the SceneKit camera (the view's pointOfView's camera) to match the projection transform used by ARKit. This is correct.

However, ARKit directly sets the SCNCamera's projectionTransform. When you work with geometric properties of SCNCamera like zNear and zFar and fieldOfView, SceneKit derives a projection matrix for use in rendering. But if you set projectionTransform directly, there's no math that can recover the near/far and xFov/yFov values, so the corresponding SCNCamera properties are invalid. That is, sceneView.pointOfView.camera.fieldOfView and similar APIs always return bogus results for an ARKit app.

So, what can you do instead? Read on...
Projection Matrix
An AR session continually vends ARFrame objects through its delegate, or you can request the currentFrame from it. Each frame has an ARCamera attached that describes the imaging parameters, one of which is a projectionMatrix that's dependent on field of view. (There's also the aforementioned SceneKit projectionTransform, which is the same matrix.)

A standard 3D projection matrix includes scaling terms that are based on the vertical field of view and aspect ratio. Specifically, the matrix looks like this:
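(Filling in the standard symmetric perspective form here; conventions for the z terms vary, but the scaling terms in the upper left are the only part the math below relies on:)

$$
\begin{bmatrix}
\mathit{xScale} & 0 & 0 & 0 \\
0 & \mathit{yScale} & 0 & 0 \\
0 & 0 & \frac{z_{far}+z_{near}}{z_{near}-z_{far}} & \frac{2\,z_{far}\,z_{near}}{z_{near}-z_{far}} \\
0 & 0 & -1 & 0
\end{bmatrix}
\qquad
\mathit{yScale} = \frac{1}{\tan(\mathit{yFov}/2)},
\qquad
\mathit{xScale} = \frac{\mathit{yScale}}{\mathit{aspect}}
$$

where aspect is the width/height ratio of the image.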
So you should be able to get yFov by solving the yScale equation, and for horizontal field of view you can multiply by the aspect ratio (specifically, the width/height ratio):
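(A sketch of both steps, assuming you can reach the running session through the sceneView from the question and that it has already produced a frame; error handling is omitted:)

// Inside your view controller (import ARKit), once the session has a current frame.
guard let camera = sceneView.session.currentFrame?.camera else { return }

// Vertical FOV: yScale is the [column 1, row 1] element of the projection matrix,
// and yScale = 1 / tan(yFov / 2), so yFov = 2 * atan(1 / yScale).
let projection = camera.projectionMatrix
let yScale = projection[1, 1]
let yFovRadians = 2 * atan(1 / yScale)
let yFovDegrees = yFovRadians * 180 / .pi

// Horizontal FOV: multiply by the camera image's width/height aspect ratio.
let imageResolution = camera.imageResolution
let xFovRadians = yFovRadians * Float(imageResolution.width / imageResolution.height)
let xFovDegrees = xFovRadians * 180 / .pi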
If you look closely, though, you might notice that the aspect ratio between xFov/yFov here (and the aspect ratio of imageResolution) doesn't necessarily match that of your device screen (especially on iPhone X) or the view you're drawing AR content in. That's because you've measured the FOV angles of the camera image, not those of your app's AR view. Don't worry, there's an API for that, too...

Projection Matrix with Viewport
ARCamera offers two APIs for getting a projection matrix. Besides the one we just went over, there's also projectionMatrix(for:viewportSize:zNear:zFar:), which takes presentation into account. If you want to match not the FOV of the camera, but the FOV of how ARSCNView or ARSKView (or Unity or Unreal, probably?) renders your AR scene, use this, passing the device orientation and the size of your view. Then do all the same math as above:
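(Again a sketch; the .portrait orientation and the zNear/zFar values here are placeholders you'd replace with whatever your app actually uses:)

guard let camera = sceneView.session.currentFrame?.camera else { return }
let viewportSize = sceneView.bounds.size

// Projection matrix matched to how the view presents the AR scene.
let projection = camera.projectionMatrix(for: .portrait,   // pass your actual interface orientation
                                          viewportSize: viewportSize,
                                          zNear: 0.001,
                                          zFar: 1000)
let yScale = projection[1, 1]
let yFovRadians = 2 * atan(1 / yScale)
let xFovRadians = yFovRadians * Float(viewportSize.width / viewportSize.height)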
What you pass for zNear and zFar doesn't matter, since we're not using the parts of the matrix that depend on them. (You might still need to ensure zNear < zFar and zNear != zFar != 0.)

Camera Intrinsics
Keen observers may have noticed that the above calculations ignore parts of the projection matrix. That's because the definition of FOV angle is an optical property of the camera, not anything to do with 3D projection, so a whole projection matrix is an intermediate result you might not really need.
ARCamera also exposes an intrinsics matrix that describes optical properties of the camera. The first and second values along the diagonal of this matrix are the horizontal and vertical focal length, measured in pixels of the camera image. If you have the focal length and the image width/height, you can compute FOV per the definition of FOV angle:
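(A sketch per that definition, fov = 2 * atan((imageDimension / 2) / focalLength), using the intrinsics diagonal and the image resolution:)

guard let camera = sceneView.session.currentFrame?.camera else { return }
let intrinsics = camera.intrinsics
let imageResolution = camera.imageResolution

// intrinsics[0, 0] and intrinsics[1, 1] are the horizontal and vertical focal lengths, in pixels.
let xFovRadians = 2 * atan(Float(imageResolution.width) / (2 * intrinsics[0, 0]))
let yFovRadians = 2 * atan(Float(imageResolution.height) / (2 * intrinsics[1, 1]))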