I am using XGBoost cv to find the optimal number of rounds for my model. I would be very grateful if someone could confirm (or refute) that the optimal number of rounds is found by:
estop = 40
res = xgb.cv(params, dvisibletrain, num_boost_round=1000000000, nfold=5, early_stopping_rounds=estop, seed=SEED, stratified=True)
best_nrounds = res.shape[0] - estop
best_nrounds = int(best_nrounds / 0.8)
i.e. the total number of rounds completed is res.shape[0], so to get the optimal number of rounds we subtract the number of early-stopping rounds.
Then we scale up the number of rounds based on the fraction of data used for validation. Is that correct?
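To make the two steps above concrete, here is the same arithmetic with an illustrative number standing in for res.shape[0] (the 240 below is made up, not taken from the question):

```python
# Worked example of the question's two-step rule, with illustrative numbers.
estop = 40
total_rounds_run = 240                     # stands in for res.shape[0]

# Step 1: the last improvement happened estop rounds before training stopped.
best_nrounds = total_rounds_run - estop    # -> 200

# Step 2: each 5-fold CV model trained on 80% of the data,
# so scale the round count up for training on 100% of it.
best_nrounds = int(best_nrounds / 0.8)     # -> 250
```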
You can get the best iteration number via 'res.best_iteration'.
Yep, that sounds correct, provided that when you do
best_nrounds = int(best_nrounds / 0.8)
you take into account that your validation set was 20% of your whole training data (which is another way of saying that you performed 5-fold cross-validation). The rule can then be generalized as:
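The generalized rule referred to above can be sketched as follows (a sketch, not official XGBoost API; the function name and arguments are illustrative):

```python
# Generalized CV scaling rule: with n_folds folds, each fold trains on
# (1 - 1/n_folds) of the data, so scale the optimal round count up by
# the inverse of that training fraction.
def full_data_rounds_cv(best_nrounds: int, n_folds: int) -> int:
    train_fraction = 1.0 - 1.0 / n_folds
    return int(round(best_nrounds / train_fraction))
```

With 5 folds the training fraction is 0.8, which recovers the question's best_nrounds / 0.8.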
Or, if you don't perform CV but a single validation split:
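For the single-validation-split case, the same idea can be sketched like this (again illustrative names, assuming the validation set is a fraction valid_fraction of the full training data):

```python
# Single holdout split: the model trained on (1 - valid_fraction) of the
# data, so scale the optimal round count up by the inverse of that fraction.
def full_data_rounds_holdout(best_nrounds: int, valid_fraction: float) -> int:
    train_fraction = 1.0 - valid_fraction
    return int(round(best_nrounds / train_fraction))
```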
You can see an example of this rule being applied here on Kaggle (see the comments).