I am dealing with a highly imbalanced data set, and my idea is to obtain the feature weights from my libSVM model. For now I am OK with the linear kernel, where I can obtain the feature weights, but when I use rbf or poly, I fail to reach my objective.
Here I am using sklearn for my model, and it's easy to obtain the feature weights for the linear kernel using .coef_. Can anyone help me do the same thing for rbf or poly? What I've tried so far is given below:
svr = SVC(C=10, cache_size=200, class_weight='auto', coef0=0.0, degree=3,
          gamma=0.12, kernel='rbf', max_iter=-1, probability=True,
          random_state=0, shrinking=True, tol=0.001, verbose=False)
clf = svr.fit(data_train, target_train)
print(clf.coef_)
This is not only impossible (as stated in the documentation, coef_ is only available in the case of a linear kernel), but it also doesn't make sense. In a linear SVM the resulting separating hyperplane lies in the same space as your input features, so its coefficients can be viewed as weights of the input "dimensions".
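To illustrate, here is a minimal sketch on synthetic data (the dataset and parameters are assumptions for demonstration, not the asker's data_train/target_train):

```python
# Sketch on synthetic data: with a linear kernel, coef_ holds
# exactly one weight per input feature, so it can be read as a
# set of per-feature importances.
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
clf = SVC(kernel='linear', C=10).fit(X, y)
print(clf.coef_.shape)  # (1, 5): one weight per input feature
```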
With other kernels, the separating hyperplane exists in a different space: the result of the kernel transformation of the original input space. Its coefficients are not directly related to the input space, so there is no per-feature weight vector to read off. In fact, for the rbf kernel the transformed space is infinite-dimensional (Wikipedia is a good starting point on this).
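For contrast, a sketch (again on assumed synthetic data, not the asker's) of what the fitted model actually exposes with kernel='rbf': accessing coef_ raises an AttributeError, while dual_coef_ gives one coefficient per support vector in the transformed space, not per input feature.

```python
# Sketch on synthetic data: with kernel='rbf', coef_ does not exist;
# only dual coefficients (one per support vector) are available.
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
clf = SVC(kernel='rbf', gamma=0.12, C=10).fit(X, y)

try:
    clf.coef_
except AttributeError as e:
    print(e)  # sklearn explains that coef_ is linear-kernel-only

# dual_coef_ has one column per support vector, not per feature:
print(clf.dual_coef_.shape)
```

This is why there is no direct rbf/poly equivalent of .coef_: the model's weights live in the kernel-induced space, indexed by support vectors rather than by your original features.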