XGBoost CV and best iteration

Posted 2019-04-12 02:12

I am using XGBoost's cv to find the optimal number of rounds for my model. I would be very grateful if someone could confirm (or refute) that the optimal number of rounds is:

    estop = 40
    res = xgb.cv(params, dvisibletrain,
                 num_boost_round=1000000000,  # effectively unlimited; early stopping halts training
                 nfold=5, early_stopping_rounds=estop,
                 seed=SEED, stratified=True)

    # res is a DataFrame with one row per completed boosting round
    best_nrounds = res.shape[0] - estop
    best_nrounds = int(best_nrounds / 0.8)  # scale up: each fold trains on 80% of the data

i.e., the total number of rounds completed is res.shape[0], so to get the optimal number of rounds we subtract the number of early-stopping rounds.

Then, we scale up the number of rounds, based on the fraction used for validation. Is that correct?
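To make the arithmetic concrete, here is a minimal sketch with hypothetical numbers (res_rows stands in for res.shape[0]; the values are invented for illustration):

```python
# Hypothetical example: early stopping fired after 240 total rounds
res_rows = 240   # res.shape[0]: rounds completed before early stopping gave up
estop = 40       # early_stopping_rounds (the patience window)

# Best round = last round completed minus the patience window
best_round = res_rows - estop            # 200

# Each CV fold trained on 4/5 of the data, so scale up for full-data training
best_nrounds = int(best_round / 0.8)     # 250
print(best_nrounds)
```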

2 Answers
可以哭但决不认输i
#2 · 2019-04-12 02:57

You can get the best iteration number via 'res.best_iteration'.

Bombasti
#3 · 2019-04-12 03:09

Yep, that sounds correct, provided that when you do best_nrounds = int(best_nrounds / 0.8) you mean that your validation set was 20% of your whole training data (which is another way of saying that you performed 5-fold cross-validation).

The rule can then be generalized as:

    n_folds = 5
    best_nrounds = int((res.shape[0] - estop) / (1 - 1 / n_folds))

Or if you don't perform CV but a single validation:

    validation_slice = 0.2
    best_nrounds = int((res.shape[0] - estop) / (1 - validation_slice))
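Both formulas above can be folded into one small helper. This is a sketch; the function name and parameters are illustrative, not from any library:

```python
def scaled_best_nrounds(total_rounds, estop, holdout_fraction):
    """Estimate rounds for full-data training after early-stopped CV or validation.

    total_rounds     -- res.shape[0]: rounds completed before early stopping quit
    estop            -- the early_stopping_rounds patience used
    holdout_fraction -- fraction of data held out (1 / n_folds for k-fold CV)
    """
    return int((total_rounds - estop) / (1 - holdout_fraction))

# 5-fold CV: the holdout fraction is 1/5
print(scaled_best_nrounds(240, 40, 1 / 5))   # 250

# a single 20% validation split gives the same scaling
print(scaled_best_nrounds(240, 40, 0.2))     # 250
```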

You can see an example of this rule being applied here on Kaggle (see the comments).
