I'm using numpy.linalg.svd to compute singular value decompositions of badly conditioned matrices. For some special cases the SVD does not converge and raises a LinAlgError. I've done some research and found that numpy uses the DGESDD routine from LAPACK. The standard implementation has a hardcoded limit of roughly 35 iterations. If I try to decompose the same matrix in Matlab, everything works fine, and I think there are two reasons for that:

1. Matlab uses DGESVD instead of DGESDD, which in general seems to be more robust.
2. Matlab uses an iteration limit of 75 in the routine. (They changed it in the source and recompiled it.)
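For reference, this is the pattern in which the failure shows up (the Hilbert matrix below is only a stand-in for my data; whether DGESDD actually fails on it depends on the matrix and the LAPACK build):

```python
import numpy as np

# Stand-in for one of the badly conditioned matrices (a 50x50 Hilbert matrix).
n = 50
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])

try:
    U, s, Vt = np.linalg.svd(A)
except np.linalg.LinAlgError as err:
    # This is the failure in question: "SVD did not converge"
    print("SVD failed:", err)
```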
Now the question is: is there a simple way to change the backend numpy uses from DGESDD to DGESVD, without having to modify the numpy source?
Thanks in advance, Mischa
I'm a little late, but maybe this will help someone else...
I had a similar problem in Julia.
I found this approach on the R help list; it should work for any environment that uses the LAPACK library:
Basically, if svd(M) fails, try svd(M') and swap the resulting U and V appropriately: since M = U*S*V' implies M' = V*S*U', the left and right singular vectors simply trade places. That's how I'm doing it in Julia.
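Restated with NumPy to match the question (a sketch only; the wrapper name is mine):

```python
import numpy as np

def svd_with_transpose_fallback(M):
    """SVD of M; if LAPACK fails to converge, retry on the transpose and swap factors."""
    try:
        return np.linalg.svd(M)
    except np.linalg.LinAlgError:
        # svd(M') returns U2, s, V2t with M' = U2 @ diag(s) @ V2t,
        # hence M = V2t' @ diag(s) @ U2': U and V trade places.
        U2, s, V2t = np.linalg.svd(M.conj().T)
        return V2t.conj().T, s, U2.conj().T

# Same return convention as np.linalg.svd: U, singular values, V transposed.
U, s, Vt = svd_with_transpose_fallback(np.eye(3))
```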
What worked for me was to only compute the "economy size" SVD of that matrix X.
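In NumPy that corresponds to the full_matrices=False option of numpy.linalg.svd; a minimal sketch (the matrix below is only a stand-in for X):

```python
import numpy as np

# Stand-in for the matrix X from above.
X = np.random.default_rng(0).normal(size=(1000, 50))

# "Economy size" SVD: U is (n, k) and Vt is (k, m) with k = min(n, m),
# instead of the full square factors.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# The thin factors still reconstruct X (up to rounding).
assert np.allclose(X, U @ np.diag(s) @ Vt)
```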