I looked at this question: Convert a 1d array index to a 3d array index? But when I test it, my result differs no matter how I calculate the index in the first place.
I am using the following to convert 3d (x,y,z) in dimensions (WIDTH, HEIGHT, DEPTH) to 1d:
i=x+WIDTH*(y+HEIGHT*z)
Then to convert back from index i to (x,y,z), where every division is integer division:
z=i/(WIDTH*HEIGHT)
y=(i/WIDTH)%HEIGHT
x=i%WIDTH
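As a concrete check, take example dimensions WIDTH=4, HEIGHT=3 (these numbers are just for illustration) and the point (x,y,z)=(1,2,1). The forward formula gives i=1+4*(2+3*1)=21, and going back: z=21/12=1 (integer division), y=(21/4)%3=5%3=2, x=21%4=1, which recovers the original coordinates. So the two formulas are mutually consistent as long as the divisions truncate.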
When I round-trip between the two conversions I get the wrong results. Is there something obviously wrong with my method? What is the correct explanation?
Here's a JavaScript version that works with these values:
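A minimal sketch of the round trip in JavaScript, assuming example dimensions WIDTH=4, HEIGHT=3, DEPTH=2 (the dimensions and function names are just for illustration). Note that JavaScript's `/` is floating-point division, so `Math.floor` is required where the formulas above assume integer division; omitting it is the most common reason the round trip appears to fail:

```javascript
// Example dimensions (illustrative only).
const WIDTH = 4, HEIGHT = 3, DEPTH = 2;

// (x, y, z) -> flat index
function to1D(x, y, z) {
  return x + WIDTH * (y + HEIGHT * z);
}

// flat index -> [x, y, z]
function to3D(i) {
  const z = Math.floor(i / (WIDTH * HEIGHT)); // Math.floor emulates integer
  const y = Math.floor(i / WIDTH) % HEIGHT;   // division, since JS `/` is
  const x = i % WIDTH;                        // floating-point
  return [x, y, z];
}

// Round-trip check over every cell in the volume.
for (let z = 0; z < DEPTH; z++) {
  for (let y = 0; y < HEIGHT; y++) {
    for (let x = 0; x < WIDTH; x++) {
      const i = to1D(x, y, z);
      const [rx, ry, rz] = to3D(i);
      console.assert(rx === x && ry === y && rz === z,
                     "mismatch at", x, y, z);
    }
  }
}
```

If every assertion passes silently, the two conversions are inverses of each other over the whole volume.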