There seem to be several threads/issues on this already, but it doesn't appear to me that any of them has been resolved:
How can I use tensorflow metric function within keras models?
https://github.com/fchollet/keras/issues/6050
https://github.com/fchollet/keras/issues/3230
People seem to run into problems either with variable initialization or with the metric always being 0.
I need to calculate different segmentation metrics and would like to include tf.metrics.mean_iou in my Keras model. This is the best I have been able to come up with so far:
def mean_iou(y_true, y_pred):
    score, up_opt = tf.metrics.mean_iou(y_true, y_pred, NUM_CLASSES)
    K.get_session().run(tf.local_variables_initializer())
    return score
model.compile(optimizer=adam, loss='categorical_crossentropy', metrics=[mean_iou])
This code does not throw any errors, but mean_iou always returns 0. I believe this is because up_opt (the update op) is never evaluated. I have seen that prior to TF 1.3 people suggested using something along the lines of control_flow_ops.with_dependencies([up_opt], score) to achieve this, but that no longer seems possible in TF 1.3.
In summary, how do I evaluate TF 1.3 metrics in Keras 2.0.6? This seems like quite an important feature.
You can use the following decorator to wrap tf.metrics functions so they can be used as keras.metrics:
import functools

import tensorflow as tf
from keras import backend as K

def as_keras_metric(method):
    @functools.wraps(method)
    def wrapper(*args, **kwargs):
        """Wrapper for turning tensorflow metrics into keras metrics."""
        # tf.metrics functions return (value, update_op); the update op
        # must run for the streaming value to change
        value, update_op = method(*args, **kwargs)
        # tf.metrics create local variables that need explicit initialization
        K.get_session().run(tf.local_variables_initializer())
        # Tie the update op to the value so it runs whenever the metric is fetched
        with tf.control_dependencies([update_op]):
            value = tf.identity(value)
        return value
    return wrapper
Basic usage:
auc_roc = as_keras_metric(tf.metrics.auc)
recall = as_keras_metric(tf.metrics.recall)
...
Compile the keras model:
model.compile(..., metrics=[auc_roc])
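For reference, here is a minimal end-to-end sketch of the decorator in use. The toy model, random data, and shapes are illustrative assumptions, not part of the original answer; it assumes the as_keras_metric decorator defined above is in scope:

import numpy as np
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Dense

# Wrap the streaming TF metric with the decorator defined above
auc_roc = as_keras_metric(tf.metrics.auc)

# Toy binary classifier; the architecture and data are illustrative only
model = Sequential([
    Dense(16, activation='relu', input_shape=(8,)),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=[auc_roc])

x = np.random.rand(64, 8).astype('float32')
y = np.random.randint(0, 2, size=(64, 1))
model.fit(x, y, epochs=2, batch_size=16)  # the wrapped 'auc' shows up in the logs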
Function decoration:
Be aware of inconsistency with arguments (e.g. the order of y_pred and y_true). Decorate your way out of it:
@as_keras_metric
def auc_roc(y_true, y_pred):
    # streaming_auc expects (predictions, labels), so swap the arguments
    return tf.contrib.metrics.streaming_auc(y_pred, y_true)
You can also use the decorator to set default parameters (e.g. for mean_iou):
@as_keras_metric
def mean_iou(y_true, y_pred, num_classes=2):
    return tf.metrics.mean_iou(y_true, y_pred, num_classes)
For metrics that return multiple values, the mean value appears to be taken during each epoch evaluation. For instance, using two threshold values in precision_at_thresholds returns two values, but as far as I can see, Keras reports their average during training.
@as_keras_metric
def precision_at_thresholds(y_true, y_pred, thresholds=(0.25, 0.50)):
    return tf.metrics.precision_at_thresholds(y_true, y_pred, thresholds)
Caveats:
Be aware, as fchollet has commented, that this is a hack. It might produce unwanted results, since Keras does not officially support TensorFlow's streaming metrics: the local variables holding the streaming state are only initialized when the metric is added to the graph, so the state appears to accumulate across batches and epochs rather than being reset.
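If the accumulating state is a concern, one possible workaround (a sketch, not part of the original answer) is to reinitialize the metric's local variables at the start of every epoch with a callback:

import tensorflow as tf
from keras import backend as K
from keras.callbacks import Callback

class ResetStreamingMetrics(Callback):
    """Reinitialize the local variables backing tf.metrics so the
    streaming state does not leak across epochs."""
    def on_epoch_begin(self, epoch, logs=None):
        K.get_session().run(tf.local_variables_initializer())

# model.fit(x, y, epochs=10, callbacks=[ResetStreamingMetrics()])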
You can still use control_dependencies:
def mean_iou(y_true, y_pred):
    score, up_opt = tf.metrics.mean_iou(y_true, y_pred, NUM_CLASSES)
    K.get_session().run(tf.local_variables_initializer())
    # Make fetching `score` also run the update op
    with tf.control_dependencies([up_opt]):
        score = tf.identity(score)
    return score
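This compiles the same way as the code in the question; a minimal sketch, assuming NUM_CLASSES matches your segmentation task:

NUM_CLASSES = 2  # hypothetical value; set to your number of classes
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=[mean_iou])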