AdaBoostClassifier with different base learners

Posted 2019-02-04 12:12

Question:

I am trying to use AdaBoostClassifier with a base learner other than DecisionTree. I have tried SVM and KNeighborsClassifier, but I get errors. Can someone point out the classifiers that can be used with AdaBoostClassifier?
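The question does not include code; a minimal sketch (not from the original post) of the kind of attempt that fails, assuming scikit-learn where the base learner parameter is `base_estimator` (renamed `estimator` in newer releases), might look like:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.neighbors import KNeighborsClassifier

    X, y = make_classification(n_samples=200, random_state=0)

    # KNeighborsClassifier.fit() does not accept sample_weight, so boosting
    # cannot reweight the training set and fit() raises an error.
    clf = AdaBoostClassifier(base_estimator=KNeighborsClassifier())
    clf.fit(X, y)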

Answer 1:

There is a systematic way to find all the base learners supported by AdaBoostClassifier: a compatible base learner's fit method must accept sample_weight. The list can be obtained by running the following code:

import inspect
from sklearn.utils import all_estimators  # in older versions: sklearn.utils.testing

# A base learner is compatible if its fit() method accepts sample_weight.
for name, clf in all_estimators(type_filter='classifier'):
    if 'sample_weight' in inspect.signature(clf.fit).parameters:
        print(name)

This produces the following output: AdaBoostClassifier, BernoulliNB, DecisionTreeClassifier, ExtraTreeClassifier, ExtraTreesClassifier, MultinomialNB, NuSVC, Perceptron, RandomForestClassifier, RidgeClassifierCV, SGDClassifier, SVC.

If the classifier doesn't implement predict_proba, you will have to set the AdaBoostClassifier parameter algorithm='SAMME'.
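For example (a hedged sketch, not from the original answer): SVC does not expose predict_proba unless constructed with probability=True, so it needs the discrete SAMME algorithm, which only requires class predictions from the base learner. The parameter name base_estimator is assumed here; newer scikit-learn versions call it estimator.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, random_state=0)

    # SVC has no predict_proba by default, so use the discrete SAMME variant.
    clf = AdaBoostClassifier(
        base_estimator=SVC(kernel='linear'),
        algorithm='SAMME',
        n_estimators=10,
    )
    clf.fit(X, y)
    print(clf.score(X, y))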

Thanks to Andreas for showing how to list all estimators.



Answer 2:

You should not use SVM with AdaBoost. AdaBoost is meant to combine weak classifiers; using a strong classifier like SVM as the base learner can result in overfitting.
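By way of contrast, the conventional weak learner is a decision stump (a depth-1 tree), which is also AdaBoostClassifier's default base estimator; a small sketch, again assuming the base_estimator parameter name:

    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    # A depth-1 decision tree ("stump") is the classic weak learner and
    # matches AdaBoostClassifier's default base estimator.
    clf = AdaBoostClassifier(
        base_estimator=DecisionTreeClassifier(max_depth=1),
        n_estimators=50,
    )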



Answer 3:

Any classifier that supports passing sample weights should work. SVC is one such classifier. What specific error message (and traceback) do you get? Can you provide a minimal reproduction case for this error (e.g., as a gist at http://gist.github.com)?