How can I print the Learning Rate at each epoch with Adam optimizer in Keras?

Posted 2019-06-01 19:45

Because online learning does not work well with Keras when you are using an adaptive optimizer (the learning rate schedule resets when calling .fit()), I want to see if I can just manually set it. However, in order to do that, I need to find out what the learning rate was at the last epoch.
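
For context, by "manually set it" I mean something along the lines of the sketch below (model, x_batch and y_batch are placeholders, and it assumes the optimizer exposes lr as a backend variable, as the standard Keras optimizers do):

from keras import backend as K

# Overwrite the optimizer's base learning rate with whatever value the
# previous round of training ended on, then resume fitting.
K.set_value(model.optimizer.lr, 1e-4)  # 1e-4 is just an example value
model.fit(x_batch, y_batch, epochs=1)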

That said, how can I print the learning rate at each epoch? I think I can do it through a callback but it seems that you have to recalculate it each time and I'm not sure how to do that with Adam.

I found this in another thread but it only works with SGD:

from keras.callbacks import Callback
from keras import backend as K

class SGDLearningRateTracker(Callback):
    def on_epoch_end(self, epoch, logs=None):
        optimizer = self.model.optimizer
        # Effective SGD learning rate after time-based decay: lr / (1 + decay * iterations)
        lr = K.eval(optimizer.lr * (1. / (1. + optimizer.decay *
                                          K.cast(optimizer.iterations, K.dtype(optimizer.decay)))))
        print('\nLR: {:.6f}\n'.format(lr))

2 Answers
孤傲高冷的网名
Answered 2019-06-01 20:33

This piece of code might help you. It is based on the Keras implementation of the Adam optimizer (the beta values are the Keras defaults):

from keras.callbacks import Callback
from keras import backend as K

class AdamLearningRateTracker(Callback):
    def on_epoch_end(self, epoch, logs=None):
        beta_1, beta_2 = 0.9, 0.999  # Keras defaults for Adam
        optimizer = self.model.optimizer
        # Base learning rate, with time-based decay applied if it is enabled
        lr = optimizer.lr
        if K.eval(optimizer.decay) > 0:
            lr = lr * (1. / (1. + optimizer.decay *
                             K.cast(optimizer.iterations, K.dtype(optimizer.decay))))
        # Bias-corrected step size that Adam actually uses at step t
        t = K.cast(optimizer.iterations, K.floatx()) + 1
        lr_t = lr * (K.sqrt(1. - K.pow(beta_2, t)) / (1. - K.pow(beta_1, t)))
        print('\nLR: {:.6f}\n'.format(K.eval(lr_t)))
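
For completeness, here is a minimal way you might try it out; the toy model and random data below are just for illustration (this assumes the standalone keras package with a TensorFlow 1.x backend):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Tiny throwaway model and data, only to exercise the callback
model = Sequential([Dense(1, input_shape=(4,))])
model.compile(optimizer='adam', loss='mse')
x = np.random.rand(32, 4)
y = np.random.rand(32, 1)

# Prints the effective Adam step size after every epoch
model.fit(x, y, epochs=3, callbacks=[AdamLearningRateTracker()])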
Melony?
Answered 2019-06-01 20:34
from keras.callbacks import Callback
from keras import backend as K

class MyCallback(Callback):
    def on_epoch_end(self, epoch, logs=None):
        lr = self.model.optimizer.lr
        # If you want to apply the optimizer's time-based decay:
        decay = self.model.optimizer.decay
        iterations = self.model.optimizer.iterations
        lr_with_decay = lr / (1. + decay * K.cast(iterations, K.dtype(decay)))
        print(K.eval(lr_with_decay))
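
If, as in the question, you also want to reuse the value when training continues, one variation (a sketch, not part of the original answer) is to keep the last computed rate on the callback instance:

class LRMemoryCallback(Callback):
    # Same computation as above, but remembers the last printed value
    def on_epoch_end(self, epoch, logs=None):
        optimizer = self.model.optimizer
        decay = optimizer.decay
        iterations = optimizer.iterations
        lr_with_decay = optimizer.lr / (1. + decay * K.cast(iterations, K.dtype(decay)))
        self.last_lr = K.eval(lr_with_decay)
        print(self.last_lr)

After fit() returns, you can read the stored value from the callback instance (e.g. tracker.last_lr) and, if you like, push it back into the optimizer with K.set_value(model.optimizer.lr, tracker.last_lr) before the next fit() call.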

Follow this thread.
