I have a rendered video atlas that's half RGB image, half depth map. Currently it's a 32-bit EXR sequence. The video then needs to be compressed into an H.264 stream, and I want to play it back as a texture in Unity, read the depth half, and apply it as a displacement on the RGB half of the image.
The issue is that going down to an 8-bit color space limits me to only 256 steps of depth. Looking at how the Xbox Kinect displays its depth values, it maps them to a rainbow instead, which packs much more resolution into each pixel.
I want to figure out the math of mapping a 0->1.0 grayscale image into an R:0->1.0 G:0->1.0 B:0->1.0 rainbow image that I can then render out into 8-bit color H.264. Then I need to reverse that math in Unity to convert it back into a grayscale map and use it as a displacement map.
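To make the question concrete, here is one possible version of that math, sketched in Python using a hue-wheel encoding (depth drives the HSV hue at full saturation and value). This is an assumption about how the rainbow could be built, not necessarily the Kinect's exact scheme; `HUE_MAX` is a parameter I've introduced to stop the hue short of wrapping back to red, so the decode is unambiguous:

```python
import colorsys

# Stop before hue 1.0, which is the same color as hue 0.0 (red),
# so every depth value maps to a unique color.
HUE_MAX = 5.0 / 6.0  # red -> magenta, assumed cutoff

def encode_depth(d):
    """Map depth in [0, 1] to a fully saturated rainbow RGB triple in [0, 1]."""
    return colorsys.hsv_to_rgb(d * HUE_MAX, 1.0, 1.0)

def decode_depth(rgb):
    """Recover depth in [0, 1] from a rainbow RGB triple."""
    h, _s, _v = colorsys.rgb_to_hsv(*rgb)
    return h / HUE_MAX

# Round trip: grayscale -> rainbow -> grayscale
for d in [0.0, 0.25, 0.5, 0.75, 1.0]:
    assert abs(decode_depth(encode_depth(d)) - d) < 1e-6
```

The appeal of a hue path like this is that it walks along the edges of the RGB cube, so at 8 bits per channel it passes through roughly 6 x 255 distinct colors instead of 256 grays. The open question for me is how well it survives H.264's chroma subsampling and quantization, since decoding relies on hue staying accurate.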
I am using The Foundry's Nuke for creating the depth map.