I'm developing an ARKit/Vision iOS app with gesture recognition. The app has a simple UI containing a single UIView; there's no ARSCNView/ARSKView at all. I put a sequence of captured ARFrames into a CVPixelBuffer, which I then use for VNRecognizedObjectObservation.
I don't need any tracking data from the session. I just need currentFrame.capturedImage as a CVPixelBuffer, and I need to capture ARFrames at 30 fps; 60 fps is an excessive frame rate.
The preferredFramesPerSecond instance property is useless in my case, because it controls the rendering frame rate of an ARSCNView/ARSKView. I have no AR views, and it doesn't affect the session's frame rate.
So I decided to use the run() and pause() methods to decrease the session's frame rate.
Question
How can I automatically run and pause an ARSession at a specified interval? Each run and pause phase must last 16 ms (0.016 sec), so that the session effectively captures every other frame of the 60 fps stream, i.e. roughly 30 fps. I suppose it might be possible with DispatchQueue, but I don't know how to implement it.
Here's some pseudo-code:
session.run(configuration)
/* run lasts 16 ms */
session.pause()
/* pause lasts 16 ms */
session.run(session.configuration!)
/* etc... */
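What I have in mind is roughly the following. This is only a sketch of my idea, not working code: the timing is my assumption, and I don't know whether ARKit tolerates rapid run/pause cycles or whether run() restarts the camera pipeline each time.

```swift
import ARKit

class ViewController: UIViewController {

    let session = ARSession()
    var isSessionRunning = false

    // Toggle between run() and pause() every 16 ms, hoping to
    // halve the effective capture rate from 60 fps to ~30 fps.
    func toggleSession() {
        if isSessionRunning {
            session.pause()
        } else if let configuration = session.configuration {
            session.run(configuration)
        }
        isSessionRunning.toggle()

        // Schedule the next toggle 0.016 sec from now.
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.016) {
            self.toggleSession()
        }
    }
}
```

Is this the right approach, or is there a better mechanism (e.g. a Timer or a DispatchSourceTimer) for such a short period?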
P.S. I can use neither CocoaPods nor Carthage in my app.
Update: here's how the ARSession's currentFrame.capturedImage is retrieved and used.
let session = ARSession()

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)

    session.delegate = self

    let configuration = ARImageTrackingConfiguration()    // 6DoF
    configuration.providesAudioData = false
    configuration.isAutoFocusEnabled = true
    configuration.isLightEstimationEnabled = false
    configuration.maximumNumberOfTrackedImages = 0
    session.run(configuration)

    spawnCoreMLUpdate()
}

func spawnCoreMLUpdate() {    // recursively spawns new async tasks
    dispatchQueue.async {
        self.spawnCoreMLUpdate()
        self.updateCoreML()
    }
}

func updateCoreML() {
    guard let pixelBuffer = session.currentFrame?.capturedImage else { return }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let imageRequestHandler = VNImageRequestHandler(ciImage: ciImage, options: [:])
    do {
        try imageRequestHandler.perform(self.visionRequests)
    } catch {
        print(error)
    }
}
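I'm also aware that, instead of pausing the session, one could keep it running at 60 fps and simply drop frames inside updateCoreML by timestamping. A sketch of that alternative, under the assumption that CACurrentMediaTime() is a suitable clock here (the variable name lastInferenceTime is mine):

```swift
import ARKit

var lastInferenceTime: CFTimeInterval = 0

func updateCoreML() {
    // Skip frames that arrive sooner than ~33 ms after the last
    // processed one, giving an effective inference rate of ~30 fps.
    let now = CACurrentMediaTime()
    guard now - lastInferenceTime >= 1.0 / 30.0 else { return }
    lastInferenceTime = now

    guard let pixelBuffer = session.currentFrame?.capturedImage else { return }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let imageRequestHandler = VNImageRequestHandler(ciImage: ciImage, options: [:])
    do {
        try imageRequestHandler.perform(self.visionRequests)
    } catch {
        print(error)
    }
}
```

But that only reduces the Vision workload; the session itself still captures at 60 fps, which is what I'm trying to avoid.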