Perhaps this is elementary, but I cannot find a good example of using mahalanobis
distance in sklearn
.
I can't even get the metric like this:
from sklearn.neighbors import DistanceMetric
DistanceMetric.get_metric('mahalanobis')
This throws an error: TypeError: 0-dimensional array given. Array must be at least two-dimensional
.
But, I can't even seem to get it to take an array:
DistanceMetric.get_metric('mahalanobis', [[0.5],[0.7]])
throws:
TypeError: get_metric() takes exactly 1 positional argument (2 given)
I checked out the docs here and here. But, I don't see what types of arguments it is expecting.
Is there an example of using the Mahalanobis distance that I can see?
MahalanobisDistance
is expecting a parameter V
which is the covariance matrix, and optionally another parameter VI
which is the inverse of the covariance matrix. Furthermore, both of these parameters are named and not positional.
Also check the docstring for the class MahalanobisDistance
in the file scikit-learn/sklearn/neighbors/dist_metrics.pyx
in the sklearn repo.
Example:
In [18]: import numpy as np
In [19]: from sklearn.datasets import make_classification
In [20]: from sklearn.neighbors import DistanceMetric
In [21]: X, y = make_classification()
In [22]: DistanceMetric.get_metric('mahalanobis', V=np.cov(X))
Out[22]: <sklearn.neighbors.dist_metrics.MahalanobisDistance at 0x107aefa58>
Edit:
For some reasons (bug?), you can't pass the distance object to the NearestNeighbor
constructor, but need to use the name of the distance metric. Also, setting algorithm='auto'
(which defaults to 'ball_tree'
) doesn't seem to work; so given X
from the code above you can do:
In [23]: nn = NearestNeighbors(algorithm='brute',
metric='mahalanobis',
metric_params={'V': np.cov(X)})
# returns the 5 nearest neighbors of that sample
In [24]: nn.fit(X).kneighbors(X[0, :])
Out[24]: (array([[ 0., 3.21120892, 3.81840748, 4.18195987, 4.21977517]]),
array([[ 0, 36, 46, 5, 17]]))