In this question six months ago, jez was nice enough to help me come up with a fast way to compute the sum of outer products of row differences, i.e.:
import numpy as np

# Accumulate the outer product of every pairwise row difference of X.
# K is D x D, where D is the number of columns of X.
K = np.zeros((X.shape[1], X.shape[1]))
for i, Xi in enumerate(X):
    for j, Xj in enumerate(X):
        dij = Xi - Xj
        K += np.outer(dij, dij)
That worked for computing the scatter matrix in a form of Fisher Discriminant Analysis. But now I am trying to do Local Fisher Discriminant Analysis, where each outer product is weighted by a matrix A that carries information about the pair's locality, so the new inner line is:
        K += A[i, j] * np.outer(dij, dij)
Unfortunately, the quick way to calculate the unweighted scatter matrix from the previous answer doesn't extend to this weighted version, and as far as I can tell the change is not straightforward.
Linear algebra is definitely not my strong suit, and I'm not good at coming up with these kinds of manipulations. What is a fast way to calculate the weighted sum of pairwise row-difference outer products?
Here is a way to vectorize the calculation you specified. If you do a lot of this kind of thing, it may be worth learning how to use numpy.tensordot. It multiplies all elements according to standard numpy broadcasting and then sums over the pairs of axes given with the axes keyword.
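For example, contracting over the first axis of two small arrays (toy shapes chosen just for illustration) reproduces an ordinary matrix product:

import numpy as np

a = np.arange(6.0).reshape(3, 2)
b = np.arange(12.0).reshape(3, 4)

# Sum over axis 0 of both arrays: c[p, q] = sum_i a[i, p] * b[i, q]
c = np.tensordot(a, b, axes=(0, 0))   # shape (2, 4), equivalent to a.T @ b
assert np.allclose(c, a.T @ b)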
Here is the code (a minimal sketch, assuming X is an NxD array and A is an NxN weight matrix; f is the reference double loop from the question):
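import numpy as np

def f(X, A):
    # Reference implementation: the double loop from the question.
    N, D = X.shape
    K = np.zeros((D, D))
    for i, Xi in enumerate(X):
        for j, Xj in enumerate(X):
            dij = Xi - Xj
            K += A[i, j] * np.outer(dij, dij)
    return K

def fbetter(X, A):
    # Vectorized, but builds an NxNxD temporary of all pairwise differences.
    dX = X[:, None, :] - X[None, :, :]              # dX[i, j] = X[i] - X[j]
    # K[a, b] = sum over i, j of A[i, j] * dX[i, j, a] * dX[i, j, b]
    return np.tensordot(A[:, :, None] * dX, dX, axes=[(0, 1), (0, 1)])

def fbest(X, A):
    # Expand (Xi - Xj)(Xi - Xj)^T = Xi Xi^T + Xj Xj^T - Xi Xj^T - Xj Xi^T
    # and regroup the weighted sum as K = X^T (diag(w) - A - A^T) X,
    # where w[i] = sum over j of (A[i, j] + A[j, i]).
    # The largest temporary here is NxN.
    w = A.sum(axis=0) + A.sum(axis=1)
    L = np.diag(w) - A - A.T
    return np.tensordot(X, L.dot(X), axes=(0, 0))   # X^T @ (L @ X)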
My first attempt (fbetter) makes a large temporary array of size NxNxD (NxNxN when X is square). The second attempt (fbest) never makes anything bigger than NxN. This works pretty well up to N ~ 1000.
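A quick sanity check that both agree with the original double loop (toy sizes made up for illustration):

X = np.random.rand(200, 5)     # N = 200 rows, D = 5 columns
A = np.random.rand(200, 200)

K_loop = f(X, A)
assert np.allclose(fbetter(X, A), K_loop)
assert np.allclose(fbest(X, A), K_loop)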
The code also runs faster when the output array is smaller, i.e. when X has fewer columns, since K is DxD.
I have MKL installed so the calls to tensordot are really fast and run in parallel.
Thanks for the question. This was a nice exercise and reminded me how important it is to avoid making large temporary arrays.