In my code, X and y are the training data:
from sklearn.svm import SVC
clf = SVC(kernel=lambda x, y: gauss_kernel(x, y, 100))
print(X.shape[0])
print(X.shape[1])
print(X.shape)
clf.fit(X, y)
I get the following error:
211
2
(211, 2)
/Users/mona/anaconda/lib/python3.6/site-packages/sklearn/utils/validation.py:547: DataConversionWarning: A column-vector y was passed when a 1d array was expected. Please change the shape of y to (n_samples, ), for example using ravel().
y = column_or_1d(y, warn=True)
---------------------------------------------------------------------------
IndexError Traceback (most recent call last)
<ipython-input-23-1f163ab380a5> in <module>()
8 print(X.shape)
9
---> 10 clf.fit(X, y)
11 plot_data()
12 plot_boundary(svm,-.5,.3,-.8,.6)
~/anaconda/lib/python3.6/site-packages/sklearn/svm/base.py in fit(self, X, y, sample_weight)
185
186 seed = rnd.randint(np.iinfo('i').max)
--> 187 fit(X, y, sample_weight, solver_type, kernel, random_seed=seed)
188 # see comment on the other call to np.iinfo in this file
189
~/anaconda/lib/python3.6/site-packages/sklearn/svm/base.py in _dense_fit(self, X, y, sample_weight, solver_type, kernel, random_seed)
226 X = self._compute_kernel(X)
227
--> 228 if X.shape[0] != X.shape[1]:
229 raise ValueError("X.shape[0] should be equal to X.shape[1]")
230
IndexError: tuple index out of range
Here's the custom Gaussian kernel I wrote:
import math
import numpy as np

def gauss_kernel(x1, x2, gamma):
    sigma = math.sqrt(gamma)
    return np.exp(-np.sum((x1 - x2)**2) / (2 * sigma**2))
How should I fix this? When I look at the SVM examples in sklearn, they basically do the same thing. I believe I'm overlooking something small, but I can't pin down the problem when comparing with the sklearn examples.
Make sure that the output of your custom kernel is a matrix (a square one when it is computed on the training data against itself). Currently your implementation of gauss_kernel returns a single number, not an array, so calling shape[0] or shape[1] on it throws the "tuple index out of range" error.
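For illustration, a quick check of what the gauss_kernel from the question actually produces (a sketch with two made-up points) shows why the shape lookup fails:

import numpy as np
k = gauss_kernel(np.array([1.0, 2.0]), np.array([0.0, 1.0]), 100)
print(np.shape(k))   # () -- a scalar has an empty shape, so shape[0] raises IndexError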
So fix that:
import math
import numpy as np

def gauss_kernel(x1, x2, gamma):
    sigma = math.sqrt(gamma)
    return np.array([np.exp(-np.sum((x1 - x2)**2) / (2 * sigma**2))])
And then use your code.
Note: this is just a workaround that wraps a single number in an array. You should check what's wrong with your original gauss_kernel and why it returns a single number in the first place.
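One thing worth checking (a sketch using the original gauss_kernel from the question, not the wrapped version above, and random stand-in data): sklearn calls a callable kernel with the whole 2-D data arrays, not with individual row pairs, so np.sum collapses everything into one number:

import numpy as np
X = np.random.rand(211, 2)     # stand-in for the training data from the question
k = gauss_kernel(X, X, 100)    # this is how SVC invokes the callable kernel
print(np.ndim(k))              # 0 -- a single number instead of a (211, 211) matrix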
from sklearn import svm
import numpy as np
import math

def gauss_kernel(x1, x2, gamma):
    x1 = x1.flatten()
    x2 = x2.flatten()
    sigma = math.sqrt(gamma)
    return np.exp(-np.sum((x1 - x2)**2) / (2 * sigma**2))

# from @lejlot http://stackoverflow.com/a/26962861/583834
def gaussianKernelGramMatrix(X1, X2, K_function=gauss_kernel, gamma=0.1):
    """(Pre)calculates Gram Matrix K"""
    gram_matrix = np.zeros((X1.shape[0], X2.shape[0]))
    for i, x1 in enumerate(X1):
        for j, x2 in enumerate(X2):
            gram_matrix[i, j] = K_function(x1, x2, gamma)
    return gram_matrix

gamma = 0.1
y = y.flatten()
clf = svm.SVC(kernel="precomputed", verbose=2, C=2.0, probability=True)
clf.fit(gaussianKernelGramMatrix(X, X, gauss_kernel, gamma=gamma), y)
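Note that with kernel="precomputed", prediction also needs a Gram matrix between the new samples and the training samples, something along these lines (Xtest is a placeholder for your test data):

# rows: test samples, columns: training samples
predictions = clf.predict(gaussianKernelGramMatrix(Xtest, X, gauss_kernel, gamma=gamma))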
I ran into the same problem today while doing the Coursera homework ex6, and I've now solved it.
When sklearn is given a custom kernel, it expects the kernel function to return a kernel matrix (during fit, an [m, m] matrix), as you can see in its source:
def _compute_kernel(self, X):
    """Return the data transformed by a callable kernel"""
    if callable(self.kernel):
        # in the case of precomputed kernel given as a function, we
        # have to compute explicitly the kernel matrix
        kernel = self.kernel(X, self.__Xfit)
        if sp.issparse(kernel):
            kernel = kernel.toarray()
        X = np.asarray(kernel, dtype=np.float64, order='C')
    return X
So I defined a kernel function that returns the matrix: it computes the pairwise squared Euclidean distances between x1 (shape [m, n]) and x2 (shape [h, n]) using the identity ||a - b||^2 = ||a||^2 + ||b||^2 - 2*a.b, then applies exp to get the [m, h] kernel matrix.
import numpy as np
from numpy import ndarray

def gaussianKernel(x1: ndarray, x2: ndarray, sigma):
    # RBFKERNEL returns a radial basis function kernel between x1 and x2
    # sim = gaussianKernel(x1, x2, sigma) returns the Gaussian kernel matrix
    # between the rows of x1 [m, n] and the rows of x2 [h, n]
    # ====================== YOUR CODE HERE ======================
    # Instructions: Fill in this function to return the similarity between x1
    #               and x2 computed using a Gaussian kernel with bandwidth sigma
    #
    # Note: use matrix operations to compute the pairwise squared distances
    M = x1 @ x2.T                                        # [m, h] inner products
    H1 = np.sum(np.square(x1), axis=1, keepdims=True)    # [m, 1] squared row norms of x1
    H2 = np.sum(np.square(x2), axis=1, keepdims=True)    # [h, 1] squared row norms of x2
    D = H1 + H2.T - 2 * M                                # [m, h] pairwise squared distances
    sim = np.exp(-D / (2 * sigma * sigma))               # Gaussian kernel matrix
    # =============================================================
    return sim
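As a quick sanity check (a sketch with made-up shapes), the matrix version should agree with the element-wise Gaussian kernel formula from the question:

import numpy as np
X1 = np.random.rand(5, 2)
X2 = np.random.rand(3, 2)
K = gaussianKernel(X1, X2, sigma=0.5)
print(K.shape)   # (5, 3)
print(np.allclose(K[0, 0], np.exp(-np.sum((X1[0] - X2[0])**2) / (2 * 0.5**2))))   # True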
Now add the following lines in the main function:
def mykernel(x1, x2):
    return gaussianKernel(x1, x2, sigma)

model = svm.SVC(C=C, kernel=mykernel)  # type: SVC
model.fit(X, y.ravel())
visualizeBoundary(X, y, model)
Finally, the plot produced by visualizeBoundary:
[image: visualizeBoundary output]
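One more note: unlike the precomputed-kernel approach above, with a callable kernel you keep passing raw feature arrays at prediction time, since sklearn calls mykernel(X_new, X_fit) internally against the stored training data. For example:

predictions = model.predict(X)        # raw features, not a precomputed Gram matrix
accuracy = model.score(X, y.ravel())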