I'm fitting a model with fit_generator, and via a custom callback I want to compute custom metrics on my validation_generator. How can I access the validation_steps and validation_data parameters within a custom callback? They're not in self.params, and I can't find them in self.model either. Here's what I'd like to do; any different approach would be welcome.
model.fit_generator(generator=train_generator,
                    steps_per_epoch=steps_per_epoch,
                    epochs=epochs,
                    validation_data=validation_generator,
                    validation_steps=validation_steps,
                    callbacks=[CustomMetrics()])
class CustomMetrics(keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs={}):
        for i in range(validation_steps):
            # features, labels = next(validation_data)
            # compute custom metric: f(features, labels)
            pass
        return
keras: 2.1.1
Update
I managed to pass my validation data to the custom callback's constructor. However, this results in an annoying "The kernel appears to have died. It will restart automatically." message. I doubt this is the right way to do it. Any suggestions?
import numpy as np
from sklearn.metrics import recall_score, precision_score, f1_score

class CustomMetrics(keras.callbacks.Callback):
    def __init__(self, validation_generator, validation_steps):
        super(CustomMetrics, self).__init__()
        self.validation_generator = validation_generator
        self.validation_steps = validation_steps

    def on_epoch_end(self, epoch, logs={}):
        self.scores = {
            'recall_score': [],
            'precision_score': [],
            'f1_score': []
        }
        for batch_index in range(self.validation_steps):
            features, y_true = next(self.validation_generator)
            y_pred = np.asarray(self.model.predict(features))
            y_pred = y_pred.round().astype(int)
            self.scores['recall_score'].append(recall_score(y_true[:, 0], y_pred[:, 0]))
            self.scores['precision_score'].append(precision_score(y_true[:, 0], y_pred[:, 0]))
            self.scores['f1_score'].append(f1_score(y_true[:, 0], y_pred[:, 0]))
        return
metrics = CustomMetrics(validation_generator, validation_steps)

model.fit_generator(generator=train_generator,
                    steps_per_epoch=steps_per_epoch,
                    epochs=epochs,
                    validation_data=validation_generator,
                    validation_steps=validation_steps,
                    shuffle=True,
                    callbacks=[metrics],
                    verbose=1)
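One possible cause of the kernel crash (an assumption on my part, not confirmed in this thread) is that the very same validation_generator instance is handed both to fit_generator's validation pass and to the callback, so two consumers advance one iterator concurrently. A minimal sketch, with a hypothetical make_val_generator factory, of constructing two independent generators over the same validation data:

```python
import numpy as np

def make_val_generator(features, labels, batch_size=2):
    """Hypothetical factory: each call returns a fresh, independent
    iterator over the same validation data, looping forever."""
    def gen():
        while True:
            for i in range(0, len(features), batch_size):
                yield features[i:i + batch_size], labels[i:i + batch_size]
    return gen()

X = np.arange(8).reshape(4, 2).astype(float)
y = np.array([0, 1, 0, 1])

g1 = make_val_generator(X, y)  # pass this one as validation_data=...
g2 = make_val_generator(X, y)  # pass this one to the callback's constructor

b1, _ = next(g1)
b2, _ = next(g2)  # g2 starts from the beginning, unaffected by g1
```

With this split, the callback exhausting its copy during on_epoch_end cannot desynchronize the generator Keras itself is reading.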
You can iterate directly over self.validation_data to aggregate all the validation data at the end of each epoch, and then calculate precision, recall and F1 across the complete validation dataset. Then you can add valid_metrics to the callbacks argument. Be sure to put it at the beginning of the callbacks list in case you want other callbacks to use these measures.
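As a sketch of the idea (the answer's original code is not included in this thread): aggregate targets and predictions first, then compute each metric once over the whole set — appending per-batch precision/recall/F1 scores, as in the update above, does not in general equal the dataset-level metric. The ValidMetrics class below is a framework-free illustration; in actual use it would subclass keras.callbacks.Callback and read self.validation_data inside on_epoch_end:

```python
import numpy as np
from sklearn.metrics import precision_score, recall_score, f1_score

class ValidMetrics:
    """Framework-free sketch of the aggregation step; in real use this
    would be a keras.callbacks.Callback computing these in on_epoch_end."""
    def compute(self, y_true, y_pred):
        # Round raw probabilities to 0/1 class labels, then score once
        # over the complete validation set.
        y_pred = np.asarray(y_pred).round().astype(int)
        return {
            'val_precision': precision_score(y_true, y_pred),
            'val_recall': recall_score(y_true, y_pred),
            'val_f1': f1_score(y_true, y_pred),
        }

# Tiny check: preds round to [1, 0, 1, 0] -> TP=1, FP=1, FN=1,
# so precision = recall = f1 = 0.5
scores = ValidMetrics().compute([1, 1, 0, 0], [0.9, 0.2, 0.8, 0.1])
```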
Reference: Keras 2.2.4
I was looking for a solution to the same problem, and then I found yours and another solution in the accepted answer here. If the second solution works, I think it is better than iterating through all the validation data again in on_epoch_end.
The idea is to save the target and prediction placeholders in variables and update those variables through a custom callback at on_batch_end.
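A hedged sketch of that idea (the names here are illustrative, not taken from the linked answer): accumulate targets and predictions batch by batch, then compute the metric once per epoch. The class is framework-free; in Keras the two methods would be the on_batch_end and on_epoch_end hooks of a keras.callbacks.Callback subclass:

```python
import numpy as np
from sklearn.metrics import f1_score

class BatchAccumulator:
    """Illustrative stand-in for a keras.callbacks.Callback subclass."""
    def __init__(self):
        self.targets, self.preds = [], []

    def on_batch_end(self, y_true, y_pred):
        # Save this batch's targets and rounded predictions.
        self.targets.append(np.asarray(y_true))
        self.preds.append(np.asarray(y_pred).round().astype(int))

    def on_epoch_end(self):
        # One metric computation over everything seen this epoch.
        y_true = np.concatenate(self.targets)
        y_pred = np.concatenate(self.preds)
        self.targets, self.preds = [], []  # reset for the next epoch
        return f1_score(y_true, y_pred)

acc = BatchAccumulator()
acc.on_batch_end([1, 0], [0.9, 0.1])  # batch 1 -> preds [1, 0]
acc.on_batch_end([1, 0], [0.2, 0.8])  # batch 2 -> preds [0, 1]
epoch_f1 = acc.on_epoch_end()
```

This avoids a second pass over the validation generator entirely, since the predictions are captured as they are made.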