How to record video with ARCore in Unity?


Question:

I have been stuck on this problem for over a month now. I just need to record the video feed when people are using the AR app.

There are several options:

1. Take a screenshot in Unity every frame.
I tried taking a screenshot every frame, but this is far too slow: the fps drops to about 5.
I then tried keeping the textures in an array and encoding them to images afterwards.
This uses a lot of memory and still causes a significant frame drop on a mobile phone; the fps is around 10.
If anyone has a good idea for this method, please let me know (one possible direction is the asynchronous readback sketch after this list).

2. Use native plugins to record the video.
I haven't found any solutions for this one, and I am afraid it may conflict with ARCore.

I know that there is a native Android solution, but ideally I want to do this in Unity. Any help is appreciated, thank you!

3. Save the texture from the TextureReader API provided by the ARCore Computer Vision example.
There is a Computer Vision example in the SDK, and I can get the texture directly from the GPU with its API.

However, the fps is still low. With its edge-detector example, the fps is around 15. I did manage to save those frames to a local directory on another thread (see the worker-thread sketch below the PPS), but the fps is still not acceptable. The bottom line is 720p at 30 fps.
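
For option 1, one way to avoid stalling the main thread on a pixel readback is to request the GPU-to-CPU copy asynchronously. This is only a sketch of that idea, assuming Unity 2019.1+ (for `ScreenCapture.CaptureScreenshotIntoRenderTexture` and `AsyncGPUReadback`) and a device/graphics API that supports async readback (e.g. Vulkan on Android); the class and field names are made up for illustration:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Rendering;

public class AsyncFrameCapture : MonoBehaviour
{
    RenderTexture _rt;
    // Raw RGBA32 frames; a writer/encoder thread should drain this queue.
    public readonly Queue<byte[]> Frames = new Queue<byte[]>();

    IEnumerator Start()
    {
        if (!SystemInfo.supportsAsyncGPUReadback)
            yield break; // not available on this device/graphics API

        _rt = new RenderTexture(Screen.width, Screen.height, 0, RenderTextureFormat.ARGB32);
        var endOfFrame = new WaitForEndOfFrame();
        while (true)
        {
            yield return endOfFrame; // camera feed + AR objects are fully rendered now
            ScreenCapture.CaptureScreenshotIntoRenderTexture(_rt);
            // Ask the GPU to copy the pixels back asynchronously; the callback fires
            // a few frames later, so rendering is never stalled by the readback.
            AsyncGPUReadback.Request(_rt, 0, TextureFormat.RGBA32, OnReadback);
        }
    }

    void OnReadback(AsyncGPUReadbackRequest request)
    {
        if (request.hasError)
            return;
        Frames.Enqueue(request.GetData<byte>().ToArray());
    }
}
```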

PS: I just need to save the frames. I can handle encoding them into videos.

PPS: Recording just the camera feed, or recording the camera feed and the augmented objects together, are both okay. Achieving either one would be great.
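
Since the frames only need to be dumped to disk (per the PS), and option 3 already saves them on another thread, here is a minimal sketch of that producer/consumer setup. The `OnImageAvailable` handler is meant to be wired to the TextureReader callback in the ARCore Computer Vision example; the exact delegate signature varies by SDK version, and the class and method names here are my own:

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Runtime.InteropServices;
using System.Threading;
using UnityEngine;

public class FrameSaver : MonoBehaviour
{
    // Bounded queue: if the writer falls behind, new frames are dropped
    // instead of piling up in memory.
    readonly BlockingCollection<byte[]> _queue = new BlockingCollection<byte[]>(8);
    string _saveDir;
    Thread _writer;

    void Start()
    {
        // Unity APIs must be called on the main thread, so cache the path here.
        _saveDir = Application.persistentDataPath;
        _writer = new Thread(WriteLoop) { IsBackground = true };
        _writer.Start();
    }

    // Wire this up to the TextureReader's image-available callback.
    public void OnImageAvailable(int width, int height, IntPtr pixelBuffer, int bufferSize)
    {
        var copy = new byte[bufferSize];
        // Copy the native buffer before ARCore reuses it.
        Marshal.Copy(pixelBuffer, copy, 0, bufferSize);
        _queue.TryAdd(copy);
    }

    void WriteLoop()
    {
        int index = 0;
        foreach (var frame in _queue.GetConsumingEnumerable())
        {
            // Raw pixel dump; these files get encoded into a video afterwards.
            File.WriteAllBytes(Path.Combine(_saveDir, $"frame_{index++:D5}.raw"), frame);
        }
    }

    void OnDestroy() => _queue.CompleteAdding();
}
```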

Answer 1:

You can easily implement video recording AND sharing using the (really great) NatCorder Unity asset (asset store link) and the related NatShare API. I did this very same thing in my own ARCore experiment/"game."
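
For reference, recording with NatCorder looks roughly like the sketch below. It follows the NatCorder 1.x API as documented around that time (`MP4Recorder`, `RealtimeClock`, `CameraInput`, `FinishWriting`); the API has changed between versions, so double-check the current docs before relying on these names, and the component itself is just an illustration:

```csharp
using NatCorder;
using NatCorder.Clocks;
using NatCorder.Inputs;
using UnityEngine;

public class ARRecorder : MonoBehaviour
{
    MP4Recorder _recorder;
    CameraInput _cameraInput;

    public void StartRecording()
    {
        // 720p at 30 fps, matching the requirement in the question.
        _recorder = new MP4Recorder(1280, 720, 30);
        var clock = new RealtimeClock();
        // CameraInput commits the frames rendered by the AR camera every frame.
        _cameraInput = new CameraInput(_recorder, clock, Camera.main);
    }

    public async void StopRecording()
    {
        _cameraInput.Dispose();                     // stop feeding frames
        var path = await _recorder.FinishWriting(); // finalize the MP4 file
        Debug.Log($"Saved recording to {path}");
        // NatShare can then be used to share the file at `path`.
    }
}
```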

Edit: you may have to implement this workaround to get a smooth framerate.