
TensorFlow 1.4 tf.metrics.auc for AUC calculation


Question:

I am trying to log the AUC during the training of my model.

According to the documentation, tf.metrics.auc needs labels and predictions, both of the same shape.

But in my case of binary classification, the labels are a one-dimensional tensor containing just the classes, while the predictions are a two-dimensional tensor containing the probability of each class for each data point.

How do I calculate the AUC in this case?

Answer 1:

Let's have a look at the parameters in the function tf.metrics.auc:

  • labels: A Tensor whose shape matches predictions. Will be cast to bool.
  • predictions: A floating point Tensor of arbitrary shape and whose values are in the range [0, 1].

This operation already assumes a binary classification. That is, each element in labels states whether the class is "positive" or "negative" for a single sample. It is not a one-hot encoding, which would require a vector with as many elements as there are exclusive classes.

Likewise, predictions represents the predicted binary class with some level of certainty (some people may call it a probability), and each element should also refer to one sample. It is not a softmax vector.
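For illustration, here is a minimal sketch (assuming TensorFlow 1.x, with made-up values) of the shapes tf.metrics.auc expects: both tensors are one-dimensional, with one entry per sample.

import tensorflow as tf

labels = tf.constant([0, 1, 1, 0])               # binary class per sample
predictions = tf.constant([0.1, 0.8, 0.6, 0.3])  # P(positive) per sample

# Shapes match: both are [4], which is exactly what tf.metrics.auc expects.
auc_value, auc_op = tf.metrics.auc(labels, predictions)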

If the probabilities came from a neural network with a fully connected layer of 2 neurons and a softmax activation at the head of the network, consider replacing that with a single neuron and a sigmoid activation. The output can now be fed to tf.metrics.auc directly.
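As a rough sketch of that head change (the tensor names below, such as features, are hypothetical and not from the question):

import tensorflow as tf

features = tf.placeholder(tf.float32, shape=[None, 128])  # hypothetical last hidden layer
labels = tf.placeholder(tf.float32, shape=[None])         # 0/1 label per sample

# Instead of a 2-unit dense layer followed by softmax,
# use a single unit with a sigmoid activation:
logit = tf.layers.dense(features, 1)                        # shape [batch, 1]
prob_positive = tf.squeeze(tf.nn.sigmoid(logit), axis=[1])  # shape [batch]

# prob_positive can be fed to tf.metrics.auc directly.
auc_value, auc_op = tf.metrics.auc(labels, prob_positive)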

Otherwise, you can just slice the predictions tensor to only consider the positive class, which will represent the binary class just the same:

auc_value, auc_op = tf.metrics.auc(labels, predictions[:, 1])
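Keep in mind that tf.metrics.auc is a streaming metric: it accumulates counts in local variables, so those must be initialized and auc_op must be run for the value to update. A usage sketch, assuming TensorFlow 1.x and stand-in placeholders for your labels and predictions:

import tensorflow as tf

labels = tf.placeholder(tf.int64, shape=[None])            # 1-D class labels
predictions = tf.placeholder(tf.float32, shape=[None, 2])  # [batch, 2] softmax output

# Column 1 holds the probability of the positive class.
auc_value, auc_op = tf.metrics.auc(labels, predictions[:, 1])

with tf.Session() as sess:
    # tf.metrics.auc keeps its running counts in local variables.
    sess.run(tf.local_variables_initializer())

    # Run the update op once per batch; the counts accumulate across calls.
    for lab, prob in [([0, 1], [[0.7, 0.3], [0.2, 0.8]]),
                      ([1, 0], [[0.4, 0.6], [0.9, 0.1]])]:
        sess.run(auc_op, feed_dict={labels: lab, predictions: prob})

    # auc_value reports the streaming AUC over everything seen so far.
    print("AUC:", sess.run(auc_value))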