Using Unity and the new 1.1 version of ARCore, the API exposes some new ways of getting the camera information. However, I can't find any good examples of saving this to local storage as a JPG, for example.
The ARCore examples include a nice sample of retrieving the camera data and then doing something with it here: https://github.com/google-ar/arcore-unity-sdk/blob/master/Assets/GoogleARCore/Examples/ComputerVision/Scripts/ComputerVisionController.cs#L212. There are a few examples of retrieving the camera data in that class, but nothing about saving that data.
I've seen this question: How to take & save picture / screenshot using Unity ARCore SDK?, which uses the older API's way of getting data and doesn't really go into detail on saving, either.
What I ideally want is a way to turn the data from `Frame.CameraImage.AcquireCameraImageBytes()` in the API into a JPG stored on disk, through Unity.
Update
I've since got it working, mainly by digging through this issue on the ARCore GitHub page: https://github.com/google-ar/arcore-unity-sdk/issues/72#issuecomment-355134812 and modifying Sonny's answer below, so it's only fair that his answer gets accepted.
In case anyone else is trying to do this, I had to do the following steps:
Add a callback in the `Start` method to run your `OnImageAvailable` method when an image is available:

```csharp
public void Start()
{
    TextureReaderComponent.OnImageAvailableCallback += OnImageAvailable;
}
```
Add a `TextureReader` component (from the ComputerVision example provided with the SDK) to your camera and your script
Your `OnImageAvailable` should look a bit like this:

```csharp
/// <summary>
/// Handles a new CPU image.
/// </summary>
/// <param name="format">The format of the image.</param>
/// <param name="width">Width of the image, in pixels.</param>
/// <param name="height">Height of the image, in pixels.</param>
/// <param name="pixelBuffer">Pointer to raw image buffer.</param>
/// <param name="bufferSize">The size of the image buffer, in bytes.</param>
private void OnImageAvailable(TextureReaderApi.ImageFormatType format, int width, int height, IntPtr pixelBuffer, int bufferSize)
{
    if (m_TextureToRender == null || m_EdgeImage == null || m_ImageWidth != width || m_ImageHeight != height)
    {
        m_TextureToRender = new Texture2D(width, height, TextureFormat.RGBA32, false, false);
        m_EdgeImage = new byte[width * height * 4];
        m_ImageWidth = width;
        m_ImageHeight = height;
    }

    System.Runtime.InteropServices.Marshal.Copy(pixelBuffer, m_EdgeImage, 0, bufferSize);

    // Update the rendering texture with the sampled image.
    m_TextureToRender.LoadRawTextureData(m_EdgeImage);
    m_TextureToRender.Apply();

    var encodedJpg = m_TextureToRender.EncodeToJPG();
    var path = Application.persistentDataPath;
    File.WriteAllBytes(path + "/test.jpg", encodedJpg);
}
```
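For completeness, a sketch of the fields and cleanup the steps above assume is shown below. The `TextureReaderComponent` field and the class name are my own illustrative choices, and the namespace of the `TextureReader` type may differ between SDK versions, so treat this as a scaffold rather than a drop-in script:

```csharp
using System;
using System.IO;
using UnityEngine;

public class CameraCaptureController : MonoBehaviour
{
    // The TextureReader from the SDK's ComputerVision example,
    // assigned in the Inspector (hypothetical field name).
    public TextureReader TextureReaderComponent;

    private Texture2D m_TextureToRender;
    private byte[] m_EdgeImage;
    private int m_ImageWidth;
    private int m_ImageHeight;

    public void Start()
    {
        TextureReaderComponent.OnImageAvailableCallback += OnImageAvailable;
    }

    public void OnDestroy()
    {
        // Unsubscribe so the reader doesn't call into a destroyed object.
        TextureReaderComponent.OnImageAvailableCallback -= OnImageAvailable;
    }

    // OnImageAvailable as shown in the step above goes here.
}
```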
Since I'm not familiar with ARCore, I shall keep this generic. In Unity, it should be possible to:

1. Load the raw image data into a `Texture2D` using `LoadRawTextureData()` and `Apply()`
2. Encode the texture to a JPG with `EncodeToJPG()` (see `UnityEngine.ImageConversion.EncodeToJPG`)
3. Save the encoded data with `File.WriteAllBytes(path + ".jpg", encodedBytes)`
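A sketch of those steps, assuming the raw data is RGBA32 pixels of known dimensions (`rawImageBytes`, `width`, `height`, and `path` are placeholder names I've chosen for illustration):

```csharp
using System.IO;
using UnityEngine;

public static class JpgSaver
{
    // Assumes rawImageBytes holds width * height * 4 bytes of RGBA32 pixel data.
    public static void SaveRawImageAsJpg(byte[] rawImageBytes, int width, int height, string path)
    {
        var texture = new Texture2D(width, height, TextureFormat.RGBA32, false);
        texture.LoadRawTextureData(rawImageBytes);
        texture.Apply();

        byte[] encodedJpg = texture.EncodeToJPG();
        File.WriteAllBytes(path, encodedJpg);
    }
}
```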
However, I'm not sure if the `TextureFormat` corresponds to a format that works with `Frame.CameraImage.AcquireCameraImageBytes()`. (I'm familiar with Unity but not ARCore.) See Unity's documentation on `TextureFormat`, and check whether that is compatible with ARCore's `ImageFormatType`.

Also, test whether the code is performant enough for your application.
EDIT: As user @Lece explains, save the encoded data with `File.WriteAllBytes`. I've updated my code example above as I omitted that step originally.

EDIT #2: For the complete answer specific to ARCore, see the update to the question post. The comments here may also be useful - Jordan specified that "the main part was to use the texture reader from the computer vision sdk example here".