A linear algebra question:
Given a k-variate unit vector u (i.e. u such that ||u||_2 = 1), how do you construct \Gamma_u, an arbitrary k*(k-1) matrix of unit vectors such that (u, \Gamma_u) forms an orthonormal basis?
I mean, from a computational standpoint: what algorithm do you use to construct such matrices?
Thanks in advance,
The naive approach would be to apply Gram–Schmidt orthogonalisation to u and k-1 randomly generated vectors. If at some point the GS algorithm generates a zero vector, then you have a linear dependency, in which case choose that vector randomly again.
However, this method is unstable: small numerical errors in the representation of the vectors get magnified. There exists a stable modification of this algorithm, known as modified Gram–Schmidt:
Let a_1 = u and a_2, ..., a_k be randomly chosen vectors, and apply modified Gram–Schmidt to them. The resulting vectors v_1, ..., v_k will be the columns of your matrix, with v_1 = u. If at some point v_j becomes zero, choose a new vector a_j
and start again. Note that the probability of this happening is negligible if the vectors a_2, ..., a_k are chosen randomly.

You can use Householder matrices to do this. See for example http://en.wikipedia.org/wiki/Householder_reflection and http://en.wikipedia.org/wiki/QR_decomposition
One can find a Householder matrix Q so that Q*u = e_1 (where e_k is the vector that is all 0s apart from a 1 in the k-th place). Then if f_k = Q*e_k, the f_k form an orthonormal basis and f_1 = u (since Q*Q = I and Q is orthogonal, so f_1 = Q*e_1 = Q*(Q*u) = u).

All this talk of matrices might make it seem that the routine would be expensive, but this is not so. For example, this C function, given a vector of length 1, returns an array with the required basis in column order, i.e. the j-th component of the i-th vector is held in b[j + dim*i].