I created a language model with a Keras LSTM and now I want to assess whether it's good, so I want to calculate perplexity.
What is the best way to calculate the perplexity of a model in Python?
I've come up with two versions and attached the sources they are based on; feel free to check the links out.
from keras import backend as K

def perplexity_raw(y_true, y_pred):
    """
    The perplexity metric. Why isn't this part of Keras yet?!
    https://stackoverflow.com/questions/41881308/how-to-calculate-perplexity-of-rnn-in-tensorflow
    https://github.com/keras-team/keras/issues/8267
    """
    # cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    # Note: this line compares argmax(y_pred) against max(y_true), so it produces
    # a 0/1 match indicator per token rather than a cross entropy.
    cross_entropy = K.cast(K.equal(K.max(y_true, axis=-1),
                                   K.cast(K.argmax(y_pred, axis=-1), K.floatx())),
                           K.floatx())
    perplexity = K.exp(cross_entropy)
    return perplexity
def perplexity(y_true, y_pred):
    """
    The perplexity metric. Why isn't this part of Keras yet?!
    https://stackoverflow.com/questions/41881308/how-to-calculate-perplexity-of-rnn-in-tensorflow
    https://github.com/keras-team/keras/issues/8267
    """
    # Per-token cross entropy; Keras averages the returned tensor over the batch,
    # so this metric reports mean(exp(CE)) rather than exp(mean(CE)).
    cross_entropy = K.sparse_categorical_crossentropy(y_true, y_pred)
    perplexity = K.exp(cross_entropy)
    return perplexity
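
For comparison, here is a minimal sketch of how corpus-level perplexity could be computed outside the training loop, as exp of the mean per-token cross entropy over held-out data (which differs from averaging exp of per-token cross entropy, as the second metric above does). The names model, x_test, y_test and corpus_perplexity are placeholders I made up; it assumes the model outputs per-token probability distributions over the vocabulary.

import numpy as np
from keras import backend as K

def corpus_perplexity(model, x_test, y_test):
    """exp of the mean per-token cross entropy over the whole test set."""
    y_pred = model.predict(x_test)                        # (batch, seq_len, vocab)
    ce = K.eval(K.sparse_categorical_crossentropy(
        K.constant(y_test), K.constant(y_pred)))          # (batch, seq_len)
    return float(np.exp(np.mean(ce)))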