Short version of my question:
What would be the best way to calculate an eigenvector of a matrix A if we already know its corresponding eigenvalue?
Longer explanation:
I have a large stochastic matrix A which, because it is stochastic, has a non-negative left eigenvector x (such that A^T x = x).
I'm looking for quick and efficient methods of numerically calculating this vector. (Preferably in MATLAB or numpy/scipy; since both wrap ARPACK/LAPACK, either would be fine.)
I know that 1 is the largest eigenvalue of A, so I know that calling something like this Python code:
from scipy.sparse.linalg import eigs
vals, vecs = eigs(A.T, k=1)  # a left eigenvector of A is a right eigenvector of A.T
will result in vals = 1 and vecs equalling the vector I need.
However, what bothers me here is that calculating eigenvalues is, in general, a more difficult operation than solving a linear system. If a matrix M is known to have an eigenvalue l, then finding the corresponding eigenvector is a matter of solving the equation (M - l*I)x = 0, i.e. finding the nullspace of M - l*I; in my case, M = A^T and l = 1. In theory at least, this should be simpler than calculating the eigenvalue in the first place, since we are only solving a linear system.
However, I find that all methods of nullspace calculation in MATLAB rely on an svd computation, a process I cannot afford to perform on a matrix of my size. I also cannot simply call a linear solver on the equation, because a solver finds only one solution, and for this singular system that is the trivial solution 0 (which, yes, is a solution, but not the one I need).
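The closest thing to a workaround I can come up with is to trade one of the equations for a normalization constraint, along these lines (a rough sketch, assuming the eigenvalue 1 is simple, i.e. the chain is irreducible):

n = size(A, 1);
M = A' - speye(n);   % the singular system matrix of (A^T - I)x = 0
M(end, :) = 1;       % the rows of A' - I sum to zero, so the last equation
                     % is redundant; replace it with sum(x) = 1
b = zeros(n, 1);
b(end) = 1;
x = M \ b;           % nontrivial solution, already normalized to sum 1

I don't know whether this is sound or efficient in general, though.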
Is there any way to avoid calls to eigs-like functions and solve my problem more quickly than by calculating the largest eigenvalue and the accompanying eigenvector (for instance, by a direct linear solve along the lines sketched above)?
Here's one approach:

1. The left eigenvector† x (a row vector satisfying x*A = x) is determined only up to a scale factor, so fix its first entry: x(1) = 1.
2. The remaining entries must satisfy x*(A - eye(n)) = 0, which gives n scalar equations in the n-1 unknowns x(2:end). Because the rows of A sum to 1, the columns of A - eye(n) sum to zero, so one of the equations (say, the first) is redundant and can be dropped.
3. Solve the resulting (n-1)-by-(n-1) linear system for x(2:end).
Example using Matlab, with a small made-up transition matrix standing in for the real A:
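A = [.6 .1 .3
     .2 .7 .1
     .5 .1 .4];                                            % rows sum to 1
x = [1, -A(1, 2:end)/(A(2:end, 2:end)-eye(size(A,1)-1))]   % steps 1-3 in one line
% x*A reproduces x (up to round-off), confirming x*A = x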
Note that the code

-A(1, 2:end)/(A(2:end, 2:end)-eye(size(A,1)-1))

is step 3.

In your formulation you define x to be a (column) right eigenvector of A^T (such that A^T x = x). This is just x.' from the above code. You can of course normalize the eigenvector to sum 1:
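x = x/sum(x);   % rescale so the entries sum to 1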
† Following the usual convention. Equivalently, this corresponds to a right eigenvector of the transposed matrix; see below.