There seem to be several threads/issues on this already but it doesn't appear to me that this has been solved:
How can I use tensorflow metric function within keras models?
https://github.com/fchollet/keras/issues/6050
https://github.com/fchollet/keras/issues/3230
People seem to run into problems either with variable initialization or with the metric always returning 0.
I need to calculate different segmentation metrics and would like to include `tf.metrics.mean_iou` in my Keras model. This is the best I have been able to come up with so far:
```python
import tensorflow as tf
from keras import backend as K

def mean_iou(y_true, y_pred):
    score, up_opt = tf.metrics.mean_iou(y_true, y_pred, NUM_CLASSES)
    K.get_session().run(tf.local_variables_initializer())
    return score  # up_opt is never evaluated, so score stays at 0

model.compile(optimizer=adam, loss='categorical_crossentropy', metrics=[mean_iou])
```
This code does not throw any errors, but `mean_iou` always returns 0. I believe this is because `up_opt` is never evaluated. I have seen that, prior to TF 1.3, people suggested using something along the lines of `control_flow_ops.with_dependencies([up_opt], score)` to achieve this, but that approach no longer seems possible in TF 1.3.
In summary, how do I evaluate TF 1.3 metrics in Keras 2.0.6? This seems like quite an important feature.
You can use the following decorator to wrap tf.metrics into keras.metrics:
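A minimal sketch of such a decorator, assuming the name `as_keras_metric` (any name works): it initializes the local variables that `tf.metrics` functions create and ties the returned value to the update op via `tf.control_dependencies`:

```python
import functools

import tensorflow as tf
from keras import backend as K


def as_keras_metric(method):
    """Wrap a tf.metrics function so Keras can call it like a regular metric."""
    @functools.wraps(method)
    def wrapper(y_true, y_pred, **kwargs):
        # tf.metrics functions return a (value_tensor, update_op) pair
        value, update_op = method(y_true, y_pred, **kwargs)
        # the streaming metrics create local variables that must be initialized
        K.get_session().run(tf.local_variables_initializer())
        # make the value depend on the update op so the op runs on every fetch
        with tf.control_dependencies([update_op]):
            value = tf.identity(value)
        return value
    return wrapper
```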
Basic usage:
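For example, wrapping a couple of `tf.metrics` functions directly (the metric choices here are arbitrary and assume the `as_keras_metric` sketch above):

```python
auc_roc = as_keras_metric(tf.metrics.auc)
recall = as_keras_metric(tf.metrics.recall)
```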
Compile the keras model:
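Something along these lines; the optimizer and loss are placeholders, the point is passing the wrapped functions via `metrics=`:

```python
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=[auc_roc, recall])
```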
Function decoration:
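For instance, a small decorated function that forwards extra arguments such as `curve='PR'` (the name `auc_pr` is illustrative):

```python
@as_keras_metric
def auc_pr(y_true, y_pred, curve='PR'):
    # returns the (value, update_op) pair; the decorator handles the rest
    return tf.metrics.auc(y_true, y_pred, curve=curve)
```

and then simply `metrics=[auc_pr]` in `model.compile`.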
Be aware of inconsistencies in the arguments (e.g. the order of `y_pred` and `y_true`). Decorate your way out of it:
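For example, a decorated wrapper that spells out the argument mapping explicitly (the name `precision` is illustrative):

```python
@as_keras_metric
def precision(y_true, y_pred):
    # Keras passes (y_true, y_pred); tf.metrics.precision expects
    # (labels, predictions), so map them explicitly
    return tf.metrics.precision(labels=y_true, predictions=y_pred)
```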
You can also use the decorator to set default parameters (e.g. for `mean_iou`):
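For example, baking in `NUM_CLASSES` from the question as a default argument:

```python
@as_keras_metric
def mean_iou(y_true, y_pred, num_classes=NUM_CLASSES):
    # note: tf.metrics.mean_iou expects class indices, so with one-hot /
    # softmax outputs you may need to argmax both tensors first
    return tf.metrics.mean_iou(y_true, y_pred, num_classes)
```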
For metrics that return multiple values, I expect that the mean value is taken during each epoch evaluation. For instance, the use of two threshold values in `precision_at_thresholds` returns two values, but as far as I can see, Keras reports the average during training.

Caveats:
- Be aware, as fchollet has commented, that this is a hack. It might produce unwanted results, since Keras does not officially support TensorFlow metrics.
- You can still use `control_dependencies` in TF 1.3; the decorator above relies on `tf.control_dependencies` to force the update op to run.
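For instance, a sketch of the same idea applied directly inside the `mean_iou` function from the question, without the decorator:

```python
def mean_iou(y_true, y_pred):
    score, up_opt = tf.metrics.mean_iou(y_true, y_pred, NUM_CLASSES)
    K.get_session().run(tf.local_variables_initializer())
    # make score depend on the update op so it is actually evaluated
    with tf.control_dependencies([up_opt]):
        score = tf.identity(score)
    return score
```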