Is there a way to use gradient boosting for regression with Vowpal Wabbit? I already use several techniques that come with Vowpal Wabbit and find them helpful. I want to try gradient boosting as well, but I can't find a way to implement gradient boosting in VW.
The idea of gradient boosting is that an ensemble model is built from black-box weak models. You can surely use VW as the black box, but note that VW does not offer decision trees, which are the most popular choice for the black-box weak models in boosting. Boosting in general decreases bias (and increases variance), so you should make sure that the VW models have low variance (no overfitting). See bias-variance tradeoff.
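For squared loss, the negative gradient of the loss with respect to the prediction is simply the residual, so one way to approximate gradient boosting by hand is to train a sequence of VW models, each on the residuals left by the ensemble so far. Below is a minimal sketch of that idea, not a built-in VW feature; it assumes a VW-format file `train.vw` whose lines look like `label | features` with no importance weights or tags, and all file names are invented:

```bash
# Sketch: manual gradient boosting for regression with VW weak learners.
# For squared loss, the negative gradient equals the residual, so each
# round fits a fresh model to the current residuals.
cp train.vw residuals.vw
for i in 1 2 3; do
  # fit a weak learner to the current residuals
  vw -d residuals.vw --loss_function squared -f model$i.vw --quiet
  # get this learner's predictions on the training examples
  vw -d residuals.vw -t -i model$i.vw -p pred$i.txt --quiet
  # new residual = old residual - prediction (label is the first field,
  # the pasted prediction is the last field, which awk then drops)
  paste -d' ' residuals.vw pred$i.txt \
    | awk '{ $1 = $1 - $NF; NF--; print }' > tmp.vw
  mv tmp.vw residuals.vw
done
# at test time, sum the predictions of model1..model3
```

This is plain first-order boosting without shrinkage; a learning rate could be emulated by scaling each prediction before subtracting it from the residuals.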
There are some reductions related to boosting and bagging in VW (example invocations are sketched after the list):

- `--autolink N` adds a link function with polynomial N, which can be considered a simple way of boosting.
- `--log_multi K` is an online boosting algorithm for K-class classification. See the paper. You can use it even for binary classification (K=2), but not for regression.
- `--bootstrap M` does M-way bootstrap by online importance resampling. Use `--bs_type=vote` for classification and `--bs_type=mean` for regression. Note that this is bagging, not boosting.
- `--boosting N` (added on 2015-06-17) is online boosting with N weak learners; see the theoretical paper.
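For reference, these flags attach to an ordinary VW training call roughly as follows; the data file names are invented, and the exact option spellings are worth checking against `vw --help` for your version:

```bash
# bagging for regression: 10 bootstrap replicas, predictions averaged
vw -d train.vw --bootstrap 10 --bs_type mean --loss_function squared -f bagged.vw

# online boosting with 20 weak learners (binary labels -1/+1)
vw -d train_binary.vw --boosting 20 -f boosted.vw

# log_multi for 5-class classification
vw -d train_multi.vw --log_multi 5 -f multi.vw
```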