I have been working on webcam streaming for video and photo capture on Android devices within Unity3D. Most of the examples I have found for capturing webcam feeds use the WebCamTexture object to access the device's camera hardware. I am currently able to capture the camera input, but WebCamTexture stores the data as a Color32[]. I found the solution below for converting a Color32[] to a byte[], but it seems to be swapping the red and blue color channels.
https://stackoverflow.com/a/21575147/8115962
Is there a way to prevent the red and blue channels from being reversed?
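For reference, the conversion from that answer looks roughly like this (my simplified version of it, so details may differ slightly):

using System.Runtime.InteropServices;
using UnityEngine;

// Pin the Color32[] and copy its raw memory (4 bytes per pixel) into a byte[].
byte[] Color32ArrayToByteArray(Color32[] colors)
{
    int byteCount = colors.Length * Marshal.SizeOf(typeof(Color32));
    byte[] bytes = new byte[byteCount];
    GCHandle handle = GCHandle.Alloc(colors, GCHandleType.Pinned);
    try
    {
        Marshal.Copy(handle.AddrOfPinnedObject(), bytes, 0, byteCount);
    }
    finally
    {
        handle.Free();
    }
    return bytes;
}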
Here is another way to convert a Color32 array from WebCamTexture to a byte array:
First, create a structure to hold the converted array:
using System.Runtime.InteropServices;
using UnityEngine;

// Both fields are placed at offset 0, so byteArray and colors reference
// the same array object and view the same memory.
[StructLayout(LayoutKind.Explicit)]
public struct Color32Array
{
    [FieldOffset(0)]
    public byte[] byteArray;
    [FieldOffset(0)]
    public Color32[] colors;
}
The WebCamTexture to convert (it must be playing before its size and pixel data are valid):
WebCamTexture webcamTex = new WebCamTexture();
webcamTex.Play(); // start the camera so its size and pixel data become available
Create a new instance of that structure:
Color32Array colorArray = new Color32Array();
Initialize the Color32 array with the appropriate size:
colorArray.colors = new Color32[webcamTex.width * webcamTex.height];
Fill the Color32 array, which automatically fills the byte array:
webcamTex.GetPixels32(colorArray.colors);
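Note that GetPixels32 only returns real pixel data once the camera is playing and has delivered at least one frame (width and height are often placeholder values until then). In practice you read it each frame, something like this rough sketch (assuming webcamTex and colorArray are fields):

void Update()
{
    // Only read pixels when the webcam has produced a new frame.
    if (webcamTex.didUpdateThisFrame)
    {
        webcamTex.GetPixels32(colorArray.colors);
        // colorArray.byteArray now views the same frame as raw bytes.
    }
}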
Now you can use colorArray.byteArray, which is the byte array.
Load into a Texture2D if needed:
// The texture size and format must match the raw data (width * height * 4 bytes of RGBA32).
Texture2D tex = new Texture2D(webcamTex.width, webcamTex.height, TextureFormat.RGBA32, false);
tex.LoadRawTextureData(colorArray.byteArray);
tex.Apply();
Like I said in my comment, it's better to convert the WebCamTexture to a Texture2D, then encode it to JPEG or PNG before sending it over the network. That will reduce the size of the image. See this answer for more information.