I have trained the language model using TensorFlow as given in this tutorial.
For training, I used the following command:
bazel-bin/tensorflow/models/rnn/ptb/ptb_word_lm --data_path=./simple-examples/data/ --model small
The training was successful, with the following output at the end:
Epoch: 13 Train Perplexity: 37.196
Epoch: 13 Valid Perplexity: 124.502
Test Perplexity: 118.624
But I am still confused about where the trained model is stored and how to use it.
The demo code probably did not include the ability to save a model; you might want to explicitly use tf.train.Saver to save and restore variables to and from checkpoints. See the docs and examples.
It's pretty straightforward according to the docs. In the example below, all of the variables in the model are saved; instead, you can choose which variable(s) to save by following the examples in the docs.
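Here is a minimal sketch of the save side, assuming the TF 1.x-style graph API (the tutorial itself ran on an older release where the initializer was tf.initialize_all_variables(); on TF 2.x the same calls live under tf.compat.v1). The toy variable and the ./model.ckpt path are illustrative, not something the tutorial defines.

import tensorflow as tf

# Toy variable standing in for the PTB model's parameters;
# in the tutorial you would build the full model graph here.
w = tf.Variable(tf.random_normal([10, 10]), name="w")

# By default the Saver covers every variable in the graph;
# pass a list/dict of variables to save only a subset.
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # ... training steps would run here ...
    save_path = saver.save(sess, "./model.ckpt")
    print("Checkpoint written to", save_path)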
In my case, each checkpoint file has a disk size of 18.61M (--model small). Regarding how to use the model, just follow the docs to restore the checkpoint from the saved files; after that, it is up to you how to use it.
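A matching sketch of the restore side, under the same assumptions: the graph has to be rebuilt (or imported) with the same variable names before restore() is called.

import tensorflow as tf

# Rebuild the same graph structure that was saved; the variable
# names must match the ones stored in the checkpoint.
w = tf.Variable(tf.zeros([10, 10]), name="w")

saver = tf.train.Saver()

with tf.Session() as sess:
    # restore() loads the saved values, so no initializer is
    # needed for the restored variables.
    saver.restore(sess, "./model.ckpt")
    # From here you can run inference, e.g. feed word ids to the
    # PTB model and fetch its logits/probabilities.
    print(sess.run(w)[0, :3])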