How to add learning rate to summaries?

Posted 2019-07-14 03:34

Question:

How do I monitor the learning rate of AdamOptimizer? The TensorBoard: Visualizing Learning guide says that I need to:

Collect these by attaching scalar_summary ops to the nodes that output the learning rate and loss respectively.

How can I do this?

Answer 1:

I think something like the following inside the graph would work fine:

with tf.name_scope("learning_rate"):
    global_step = tf.Variable(0)
    decay_steps = 1000 # setup your decay step
    decay_rate = .95 # setup your decay rate
    learning_rate = tf.train.exponential_decay(0.01, global_step, decay_steps, decay_rate, staircase=True, "learning_rate")
tf.scalar_summary('learning_rate', learning_rate)
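
Since the question asks about AdamOptimizer specifically, here is a minimal sketch of how the decayed rate could be fed into it. The toy loss is purely illustrative; substitute your model's real loss:

# Toy loss purely for illustration; replace with your model's loss.
w = tf.Variable(5.0)
loss = tf.square(w)

optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate)
# Passing global_step lets minimize() increment it on each step,
# which is what drives the exponential decay schedule forward.
train_op = optimizer.minimize(loss, global_step=global_step)

Note that Adam adapts its per-parameter step sizes internally, so the summary above tracks the base learning rate handed to the optimizer, not the effective per-parameter rates.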

(Of course, to make this work you also need to merge the summaries with tf.summary.merge_all() and write them to a log directory with tf.summary.FileWriter at the end, as sketched below.)
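
For completeness, a rough sketch of that last step, assuming the graph above and a hypothetical log directory /tmp/lr_logs:

merged = tf.summary.merge_all()
with tf.Session() as sess:
    writer = tf.summary.FileWriter('/tmp/lr_logs', sess.graph)  # hypothetical log dir
    sess.run(tf.global_variables_initializer())
    for step in range(5000):
        # Each iteration runs one training step and evaluates the merged summaries.
        _, summary = sess.run([train_op, merged])
        writer.add_summary(summary, global_step=step)
    writer.close()

Pointing TensorBoard at /tmp/lr_logs will then show the learning_rate curve under the Scalars tab.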