I'm using the GridSearchCV object to train a classifier. I set up a 5-fold cross-validation parameter search, and after calling fit() I need to see the metrics for each fold's validation set, namely accuracy and f1-score. How can I do this?
clf = GridSearchCV(pipeline,
                   param_grid=param_grid,
                   n_jobs=1,
                   cv=5,
                   return_train_score=True)
Note:
- I don't have a separate test set, so I can't just take the output of predict() and feed it to the standard metrics functions.
- Using clf.best_scores_ doesn't give the information I want; it only shows the mean_validation_score and its standard deviation.
Scores are located in grid_scores_, in particular in cv_validation_scores.
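For example, here is a minimal sketch of inspecting the per-fold scores after fit() (assuming a scikit-learn version that still exposes grid_scores_; newer releases replace it with cv_results_):

for entry in clf.grid_scores_:
    # each entry corresponds to one parameter combination
    print(entry.parameters)              # the parameter values tried
    print(entry.cv_validation_scores)    # one validation score per fold
    print(entry.mean_validation_score)   # mean over the 5 folds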
However, you will not get two metrics from a single run. The whole point of such an optimizer is to maximize a single metric/scorer function, so only that one score is stored inside the object. To get both accuracy and f1-score, you will need to run the search twice, each time with a different scoring function.
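A rough sketch of running the search twice, once per metric (X_train and y_train stand in for your training data; pipeline and param_grid are the objects from the question):

clf_acc = GridSearchCV(pipeline, param_grid=param_grid, cv=5, scoring='accuracy')
clf_acc.fit(X_train, y_train)

clf_f1 = GridSearchCV(pipeline, param_grid=param_grid, cv=5, scoring='f1')
clf_f1.fit(X_train, y_train)

# the per-fold accuracy and f1 scores can then be read from
# clf_acc.grid_scores_ and clf_f1.grid_scores_ respectively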