I saw this on the OpenGL site:
OpenGL Shading Language attribute variables are allowed to be of type mat2, mat3, or mat4. Attributes of these types may be loaded using the glVertexAttrib entry points. Matrices must be loaded into successive generic attribute slots in column major order, with one column of the matrix in each generic attribute slot.
I have been successfully using a vector library that stores its matrices in row-major order. That is, the 16 elements of a 4x4 matrix are laid out in a struct like this:
vec4 x;
vec4 y;
vec4 z;
vec4 w;
where each vec4 has components x, y, z, w. The linear order of the contents is therefore row-major, since the first row occupies the first four positions in memory, and the matrix math is written to match. For example, this rotation transform:
static Matrix4<T> RotateZ(T degrees)
{
    T radians = degrees * 3.14159f / 180.0f;
    T s = std::sin(radians);
    T c = std::cos(radians);

    Matrix4 m;
    m.x.x =  c; m.x.y = s; m.x.z = 0; m.x.w = 0;  // first row
    m.y.x = -s; m.y.y = c; m.y.z = 0; m.y.w = 0;  // second row
    m.z.x =  0; m.z.y = 0; m.z.z = 1; m.z.w = 0;  // third row
    m.w.x =  0; m.w.y = 0; m.w.z = 0; m.w.w = 1;  // fourth row
    return m;
}
The library is row-major so that client-side code can compose transforms left to right, which is natural for a lot of programmers. But what I don't understand is that these row-major matrices are being sent as-is to the shaders, which, according to the quote above, expect column-major order.
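In case it matters, here is roughly how the matrices reach the shader. This is a sketch: the header, the uniform location, and operator* are from my setup, and I'm assuming the transpose flag is GL_FALSE (I never set it to anything else):

#include <GLES2/gl2.h>  // or the desktop GL header, depending on platform

// Compose transforms left to right on the client side, then hand the
// 16 floats to GL exactly as they sit in memory (transpose = GL_FALSE).
void UploadModelview(GLint location, const Matrix4<float>& rotation,
                     const Matrix4<float>& translation)
{
    Matrix4<float> modelview = rotation * translation;  // left-to-right
    glUniformMatrix4fv(location, 1, GL_FALSE, &modelview.x.x);
}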
I understand that the shader performs its calculations right to left. But shouldn't a row-major matrix be transposed into column-major order before being fed to a shader? I have not been doing this, yet everything works just fine.
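For reference, the vertex shader does the usual right-to-left multiplication; the uniform and attribute names here are just illustrative:

attribute vec4 Position;
uniform mat4 Projection;
uniform mat4 Modelview;

void main()
{
    // Right to left: Position is transformed by Modelview, then Projection.
    gl_Position = Projection * Modelview * Position;
}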