Several of my peers have mentioned that "linear algebra" is very important when studying algorithms. I've studied a variety of algorithms and taken a few linear algebra courses and I don't see the connection. So how is linear algebra used in algorithms?
For example, what interesting things can one do with the connectivity matrix of a graph?
Three concrete examples:
Basically it comes down to the fact that linear algebra is a very powerful tool when dealing with multiple variables, and there are enormous benefits to using it as a theoretical foundation when designing algorithms. In many cases this foundation isn't as apparent as you might think, but that doesn't mean it isn't there. It's quite possible that you've already implemented algorithms that would have been incredibly hard to derive without linear algebra.
Linear algebra is also important in many algorithms in computer algebra, as you might have guessed. For example, if you can reduce a problem to showing that a polynomial is zero, where the coefficients of the polynomial are linear in the variables $x_1, \dots, x_n$, then you can solve for the values of $x_1, \dots, x_n$ that make the polynomial equal to 0 by equating the coefficient of each power of $x$ to 0 and solving the resulting linear system. This is called the method of undetermined coefficients, and is used, for example, in computing partial fraction decompositions or in integrating rational functions.
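To make that concrete, here is a minimal sketch (the rational function 1 / ((x - 1)(x + 2)) is just a toy example of mine) of how a partial fraction decomposition turns into a small linear solve:

```python
import numpy as np

# Decompose 1 / ((x - 1)(x + 2)) as A/(x - 1) + B/(x + 2).
# Clearing denominators: 1 = A(x + 2) + B(x - 1) = (A + B) x + (2A - B).
# Equating the coefficient of each power of x to 0 (or to the target value)
# gives a linear system in the unknowns A and B:
#   x^1:  A + B = 0
#   x^0: 2A - B = 1
coeffs = np.array([[1.0,  1.0],
                   [2.0, -1.0]])
rhs = np.array([0.0, 1.0])

A, B = np.linalg.solve(coeffs, rhs)
print(f"A = {A}, B = {B}")   # A = 1/3, B = -1/3
```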
As for graph theory, the coolest thing about an adjacency matrix is that if you take the $n$-th power $M^n$ of the adjacency matrix of an unweighted graph (each entry is either 0 or 1), then entry $(i, j)$ is the number of walks of length $n$ from vertex $i$ to vertex $j$. And if that isn't just cool, then I don't know what is.
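A quick numpy sketch of that fact (the little triangle graph is just an arbitrary choice):

```python
import numpy as np

# Adjacency matrix of a small unweighted graph: the triangle on vertices 0, 1, 2.
M = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])

# Entry (i, j) of M^n counts the walks of length n from vertex i to vertex j.
n = 3
Mn = np.linalg.matrix_power(M, n)
print(Mn)
# Mn[0, 1] == 3: the three length-3 walks 0-1-0-1, 0-1-2-1, and 0-2-0-1.
```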
Ha, I can't resist putting this here (even though the other answers are good):
The $25 billion dollar eigenvector.
I'm not going to lie... I never even read the whole thing... maybe I will now :-).
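For anyone else who hasn't read it: the punchline of that paper is PageRank, where the ranking of web pages is the principal eigenvector of a damped link matrix. Here is a tiny power-iteration sketch; the three-page link structure and the damping factor are made-up illustrative values:

```python
import numpy as np

# Rough sketch of the idea: PageRank scores are the principal eigenvector
# of a column-stochastic "Google matrix". The 3-page link structure and the
# damping factor below are illustrative choices, not real data.
links = np.array([[0, 0, 1],   # entry (i, j) = 1 if page j links to page i
                  [1, 0, 0],
                  [1, 1, 0]], dtype=float)
S = links / links.sum(axis=0)              # normalize columns: column-stochastic
d = 0.85                                   # damping factor
n = S.shape[0]
G = d * S + (1 - d) / n * np.ones((n, n))  # the damped "Google matrix"

# Power iteration: repeated multiplication converges to the principal
# eigenvector, i.e. the PageRank vector.
r = np.ones(n) / n
for _ in range(100):
    r = G @ r
print(r)
```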
All of the answers here are good examples of linear algebra in algorithms.
As a meta answer, I will add that you might be using linear algebra in your algorithms without knowing it. Compilers that optimize with SSE(2) typically vectorize your code so that many data values are manipulated in parallel, which is essentially elementary linear algebra.
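For a rough feel (in numpy rather than at the instruction level, so this is only an analogy to what the compiler does): the loop below is exactly the kind of elementwise operation a vectorizer turns into parallel vector arithmetic, and written without the loop it is literally a linear combination of vectors:

```python
import numpy as np

# The same elementwise computation, once as a scalar loop (what a compiler
# would vectorize) and once as a single vector operation c = a + 3b.
# The arrays here are arbitrary example data.
a = np.arange(8, dtype=np.float64)
b = 2.0 * np.arange(8, dtype=np.float64)

c_loop = np.empty_like(a)
for i in range(len(a)):        # one element at a time
    c_loop[i] = a[i] + 3.0 * b[i]

c_vec = a + 3.0 * b            # whole vectors at once

assert np.allclose(c_loop, c_vec)
```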
A cryptographer would probably tell you that a grasp of number theory is very important when studying algorithms. And he'd be right--for his particular field. Probability and statistics have their uses too--skip lists, hash tables, etc. The usefulness of graph theory is even more obvious.
There's no inherent link between linear algebra and algorithms; there's an inherent link between mathematics and algorithms.
Linear algebra is a field with many applications, and the algorithms that draw on it therefore have many applications as well. You've not wasted your time studying it.
Many algebraic properties of the adjacency matrix are invariant under permutations of the vertices (for example, the absolute value of the determinant), so if two graphs are isomorphic, those values will be equal.
This is a source of good heuristics for showing that two graphs are not isomorphic, since of course equality of the invariants does not guarantee the existence of an isomorphism.
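Here is a small sketch of that heuristic (the path-versus-triangle pair is my own example, chosen so the invariants actually differ):

```python
import numpy as np

# Compare permutation-invariant quantities of two adjacency matrices.
# If any invariant differs, the graphs cannot be isomorphic; if they all
# agree, the graphs may or may not be isomorphic.
def invariants(adj):
    adj = np.asarray(adj, dtype=float)
    spectrum = np.sort(np.linalg.eigvalsh(adj))   # also permutation-invariant
    return abs(np.linalg.det(adj)), spectrum

# Path on 3 vertices vs. a triangle: same size, different invariants.
path = np.array([[0, 1, 0],
                 [1, 0, 1],
                 [0, 1, 0]])
triangle = np.array([[0, 1, 1],
                     [1, 0, 1],
                     [1, 1, 0]])

det_p, spec_p = invariants(path)
det_t, spec_t = invariants(triangle)
same = np.isclose(det_p, det_t) and np.allclose(spec_p, spec_t)
print("possibly isomorphic" if same else "definitely not isomorphic")
```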
Check out algebraic graph theory for a lot of other interesting techniques.