What is the best way to calculate the sum of matrices such as A^i + A^(i+1) + A^(i+2) + ... + A^n for very large n?
I have thought of two possible ways:
1) Use logarithmic matrix exponentiation (LME) for A^i, then calculate each subsequent matrix by multiplying by A.
Problem: This doesn't really take advantage of the LME algorithm, since I only use it for the lowest power and still need n - i ordinary multiplications afterwards!
2) Use LME to find A^n and memoize the intermediate calculations.
Problem: Too much space is required for large n.
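(For reference, by LME I mean ordinary exponentiation by squaring applied to matrices, roughly like this numpy sketch; mat_pow is just a name I made up:)

    import numpy as np

    def mat_pow(A, k):
        """Compute A^k with O(log k) matrix multiplications (exponentiation by squaring)."""
        result = np.eye(A.shape[0], dtype=A.dtype)  # start from the identity
        base = A.copy()
        while k > 0:
            if k & 1:                    # current bit of k is set
                result = result @ base
            base = base @ base           # square: A, A^2, A^4, A^8, ...
            k >>= 1
        return result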
Is there a third way?
Notice that:
Let

    B(n) = A + A^2 + ... + A^n

We have:

    B(2m)   = (I + A^m) B(m)
    B(2m+1) = A + A B(2m)

So you will be doing a logarithmic number of steps and there is no need to compute inverses.
While a direct implementation (recomputing A^m from scratch at every level) leads to a (log n)^2 number of matrix multiplications, you can keep it at log n by computing the powers of A as you compute B.
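A minimal numpy sketch of this recursion (the names power_and_sum and sum_of_powers are mine, not from the answer); it carries A^n alongside B(n) so each level costs only a constant number of matrix multiplications, and it uses A^i + ... + A^n = A^(i-1) B(n - i + 1) to handle the starting index:

    import numpy as np

    def power_and_sum(A, n):
        """Return (A^n, B(n)) where B(n) = A + A^2 + ... + A^n, in O(log n) multiplications."""
        if n == 1:
            return A, A
        if n % 2 == 0:
            A_half, B_half = power_and_sum(A, n // 2)
            I = np.eye(A.shape[0], dtype=A.dtype)
            # B(2m) = (I + A^m) B(m),  A^(2m) = (A^m)^2
            return A_half @ A_half, (I + A_half) @ B_half
        # odd n:  B(2m+1) = A + A B(2m),  A^(2m+1) = A A^(2m)
        A_prev, B_prev = power_and_sum(A, n - 1)
        return A @ A_prev, A + A @ B_prev

    def sum_of_powers(A, i, n):
        """A^i + A^(i+1) + ... + A^n = A^(i-1) * (A + ... + A^(n-i+1)), assuming 1 <= i <= n."""
        _, B = power_and_sum(A, n - i + 1)
        if i == 1:
            return B
        A_prefix, _ = power_and_sum(A, i - 1)
        return A_prefix @ B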
You can use the fact that the sum to n of a geometric series of matrices equals:

    I + A + A^2 + ... + A^n = (A^(n+1) - I)(A - I)^(-1)

and, since you don't start from 0, you can simply calculate:

    A^i + A^(i+1) + ... + A^n = (A^(n+1) - A^i)(A - I)^(-1)

where i is your starting index.
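A numpy sketch of that formula (geometric_sum is my own name); np.linalg.solve is used instead of explicitly forming (A - I)^(-1), which of course still requires A - I to be invertible:

    import numpy as np

    def geometric_sum(A, i, n):
        """A^i + A^(i+1) + ... + A^n = (A^(n+1) - A^i)(A - I)^(-1), assuming A - I is invertible."""
        I = np.eye(A.shape[0])
        numerator = np.linalg.matrix_power(A, n + 1) - np.linalg.matrix_power(A, i)
        # We need X with X (A - I) = numerator; transpose to the standard solve form:
        # (A - I)^T X^T = numerator^T
        return np.linalg.solve((A - I).T, numerator.T).T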
Why not just diagonalize the matrix to make the multiplication cheap?
edit:
As long as the matrix is diagonalizable (being nonsingular is not enough: you need a full set of linearly independent eigenvectors) you can find a diagonal representation D of matrix A such that A = PDP^-1, where P is made up of the eigenvectors of A and D has the eigenvalues of A along the diagonal. Getting D^m = D*D^(m-1) is cheap since you're only multiplying along the diagonal (i.e. the same number of multiplications as the dimension of the matrix).
Getting S(m)=S(m-1)+D^m is also cheap since you're only adding diagonal elements.
Then you have
A^i + A^(i+1) + A^(i+2) + ... + A^n = P(D^i + D^(i+1) + D^(i+2) + ... + D^n)P^-1 = P( S(n) - S(i-1) )P^-1
The only difficult thing is finding P and P^-1.
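A rough numpy sketch of this approach (sum_powers_diag is my own name), assuming A really is diagonalizable; np.linalg.eig may return complex eigenvalues even for a real A, so the result is cast back to real only when the input is real:

    import numpy as np

    def sum_powers_diag(A, i, n):
        """A^i + A^(i+1) + ... + A^n via A = P D P^(-1), assuming A is diagonalizable."""
        eigvals, P = np.linalg.eig(A)    # D = diag(eigvals); columns of P are eigenvectors of A
        P_inv = np.linalg.inv(P)
        # Work only with the diagonals: d_pow is the diagonal of D^m, diag_sum the diagonal of S(m).
        d_pow = eigvals ** i             # diagonal of D^i
        diag_sum = d_pow.copy()          # S(i) = D^i
        for _ in range(i + 1, n + 1):
            d_pow = d_pow * eigvals      # D^m = D * D^(m-1): one multiply per diagonal entry
            diag_sum = diag_sum + d_pow  # S(m) = S(m-1) + D^m: one add per diagonal entry
        S = P @ np.diag(diag_sum) @ P_inv
        return S.real if np.isrealobj(A) else S

For very large n you don't even need the loop: on the diagonal everything is scalar, so each entry of S(n) - S(i-1) can be closed with the scalar geometric-series formula (lambda^(n+1) - lambda^i)/(lambda - 1), handling lambda = 1 separately.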