Short question: How can I select which checkpoint to view in TensorBoard's embeddings tab?
Longer version of the question:
I want to visualize word embeddings with TensorBoard. To that end, after reading the official tutorial (mirror), I added the following code:
import os

import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

embedding_writer = tf.summary.FileWriter(model_folder)
embeddings_projector_config = projector.ProjectorConfig()
embedding = embeddings_projector_config.embeddings.add()
embedding.tensor_name = model.W.name  # W corresponds to the embeddings' weights.
# Writes projector_config.pbtxt into model_folder so TensorBoard can find the embedding.
projector.visualize_embeddings(embedding_writer, embeddings_projector_config)
# Initialize the model
model_saver = tf.train.Saver()
sess.run(tf.global_variables_initializer())
[...]
# Then, for each training epoch:
model_saver.save(sess, os.path.join(model_folder, 'model_{0:05d}.ckpt'.format(epoch_number)))
Looking at the folder where TensorFlow saves the logs, I do have a checkpoint for each epoch.
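(For reference, a quick way to list them from Python, assuming the default V2 saver format where each checkpoint is split into .index and .data files:)

import glob
import os

# One model_XXXXX.ckpt.index file is written per saved epoch, so listing
# the .index files shows which epochs have a checkpoint:
for index_file in sorted(glob.glob(os.path.join(model_folder, 'model_*.ckpt.index'))):
    print(index_file[:-len('.index')])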
However, in TensorBoard's embeddings tab it seems that I can only view the latest checkpoint.
I would sometimes like to view the embeddings from previous epochs. How can I select which checkpoint to view in TensorBoard's embeddings tab?
I'm one of the engineers working on the embedding visualizer. Thanks for the feedback. We are planning to add a dropdown menu in the UI that allows you to choose different checkpoints.
In the meantime, there is a workaround: you can edit the projector_config.pbtxt that lives in the folder where TensorBoard saves the logs. I'm assuming its contents are:
embeddings {
...
}
Append the following line at the end of the file:
model_checkpoint_path: "path_to_log_dir/model_0000N.ckpt"
pointing to the exact checkpoint you want to visualize, and remove the line model_checkpoint_dir: "..." if it exists. Then refresh the page (and potentially restart TensorBoard).
For example, if you have launched TensorBoard with tensorboard --logdir=output and the checkpoint's absolute path is C:\Users\a\output\en_2017-03-08_17-42-09-310106\model\model_00004.ckpt, then you should append the following to projector_config.pbtxt:
model_checkpoint_path: "output\en_2017-03-08_17-42-09-310106\model\model_00004.ckpt"
Example of the resulting projector_config.pbtxt:
embeddings {
tensor_name: "token_embedding/W:0"
}
model_checkpoint_path: "output\en_2017-03-08_17-42-09-310106\model\model_00004.ckpt"
If nothing appears when you open the embeddings tab in TensorBoard, the model_checkpoint_path you entered is most likely incorrect.
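If you would rather not edit the file by hand, the same change can be scripted. Here is a minimal sketch (it assumes the TF 1.x contrib projector module and protobuf's text_format helpers; log_dir and the checkpoint name are just the values from the example above, so adapt them to your setup):

import os

from google.protobuf import text_format
from tensorflow.contrib.tensorboard.plugins import projector

log_dir = r'output\en_2017-03-08_17-42-09-310106\model'  # folder holding projector_config.pbtxt
config_path = os.path.join(log_dir, 'projector_config.pbtxt')

# Parse the existing projector_config.pbtxt.
config = projector.ProjectorConfig()
with open(config_path) as f:
    text_format.Merge(f.read(), config)

# Point the projector at the checkpoint you want to inspect, and clear
# model_checkpoint_dir so it cannot override the explicit path.
config.model_checkpoint_path = os.path.join(log_dir, 'model_00004.ckpt')
config.model_checkpoint_dir = ''

# Write the modified config back, then refresh the page.
with open(config_path, 'w') as f:
    f.write(text_format.MessageToString(config))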
Hope this helps!