I am implementing some spectral graph algorithms for a project. A large part of this is finding eigenvalues and eigenvectors of large, sparse matrices, as well as multiplying matrices.
My question is, what are the fastest libraries that do this? I've been looking at NumPy for Python or JAMA for Java. Are these good, or is there something better?
Thank you.
First of all, you need to specify exactly which matrix operations you need to perform, because some libraries are only very good at a few specialized operations. For example, ARPACK is good at finding the few largest eigenvalues of a large sparse matrix (see my questions above).
But in general NumPy/SciPy is a good choice. It wraps several libraries such as LAPACK, ARPACK, and SuperLU, and it gives you a nice Python interface to work with.
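As a rough illustration of what that interface looks like, here is a minimal sketch; the random matrix and the sizes are made up for demonstration, but the scipy.sparse calls themselves are standard:

```python
import numpy as np
import scipy.sparse as sp

# Illustrative only: a random sparse matrix standing in for your graph matrix.
n = 100000
A = sp.rand(n, n, density=1e-5, format='csr')

# Sparse matrix-vector and matrix-matrix products; the result of A @ A
# stays sparse, so no dense n x n array is ever formed.
x = np.random.rand(n)
y = A @ x
B = A @ A
print(y.shape, B.nnz)
```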
Alternatively, you can use Octave or MATLAB, or use C++ to wrap a library that is specialized in the operations you need to perform.
JAMA was built for dense matrices, and you want to work with sparse matrices, so JAMA is a bad choice for you. http://math.nist.gov/javanumerics/jama/
EDIT: I am not an expert on all this, but as far as I know you need a library that uses the Lanczos algorithm (http://en.wikipedia.org/wiki/Lanczos_algorithm).
The ARPACK library uses this algorithm, so ARPACK is a good choice. The Python module scipy.sparse.linalg wraps ARPACK, so SciPy is also a good choice.
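Since you mentioned spectral graph algorithms, here is a hedged sketch of how that typically looks with scipy.sparse.linalg.eigsh (which calls ARPACK's Lanczos iteration); the toy graph is made up, and csgraph.laplacian / eigsh are the real SciPy functions:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse import csgraph
from scipy.sparse.linalg import eigsh

# Toy undirected path graph 0-1-2-3-4-5 (illustrative only).
rows = np.array([0, 1, 1, 2, 2, 3, 3, 4, 4, 5])
cols = np.array([1, 0, 2, 1, 3, 2, 4, 3, 5, 4])
data = np.ones(len(rows))
A = sp.csr_matrix((data, (rows, cols)), shape=(6, 6))

# Graph Laplacian L = D - A.
L = csgraph.laplacian(A, normed=False)

# A few smallest eigenpairs via ARPACK's Lanczos iteration; the eigenvector
# for the second-smallest eigenvalue (the Fiedler vector) is what spectral
# partitioning methods usually work with.
vals, vecs = eigsh(L, k=3, which='SM')
print(vals)
print(vecs[:, 1])  # Fiedler vector
```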
For the record, LAPACK was also created for dense matrices, so LAPACK is a bad choice unless your matrices are so small that speed doesn't matter. I believe EISPACK has been outdated for years.
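For contrast, a dense LAPACK-backed eigendecomposition is only a couple of lines in NumPy and is perfectly reasonable at small sizes (sketch with made-up dimensions), but it needs O(n^2) memory and O(n^3) time, so it won't scale to large sparse problems:

```python
import numpy as np

# Dense symmetric eigendecomposition via LAPACK (numpy.linalg.eigh).
n = 500
M = np.random.rand(n, n)
M = (M + M.T) / 2  # symmetrize
vals, vecs = np.linalg.eigh(M)
print(vals[-5:])   # five largest eigenvalues
```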
Jack Dongarra is a recognised authority in this area: Freely Available Software for Linear Algebra on the Web
The most popular open-source libraries for eigenvalue computation are LAPACK, EISPACK, and ARPACK.