Because online learning does not work well with Keras when you are using an adaptive optimizer (the learning rate schedule resets when calling .fit()
), I want to see if I can just manually set it. However, in order to do that, I need to find out what the learning rate was at the last epoch.
So, how can I print the learning rate at each epoch? I think I can do it through a callback, but it seems you have to recalculate the effective rate yourself each time, and I'm not sure how to do that for Adam.
I found this in another thread, but it only works with SGD:
from keras.callbacks import Callback
from keras import backend as K

class SGDLearningRateTracker(Callback):
    def on_epoch_end(self, epoch, logs={}):
        optimizer = self.model.optimizer
        # effective learning rate after time-based decay
        lr = K.eval(optimizer.lr * (1. / (1. + optimizer.decay * optimizer.iterations)))
        print('\nLR: {:.6f}\n'.format(lr))
This piece of code might help you. It is based on the Keras implementation of the Adam optimizer (the beta values are the Keras defaults).
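Roughly along these lines, as a sketch rather than the exact Keras internals: it assumes the standalone Keras API where optimizer.lr, optimizer.decay and optimizer.iterations are backend variables (as in the SGD tracker above), and it hard-codes the default beta values.

from keras.callbacks import Callback
from keras import backend as K

class AdamLearningRateTracker(Callback):
    def on_epoch_end(self, epoch, logs=None):
        optimizer = self.model.optimizer
        # Base learning rate, reduced by the optional time-based decay.
        lr = optimizer.lr * (1. / (1. + optimizer.decay *
                                   K.cast(optimizer.iterations, K.dtype(optimizer.decay))))
        # Bias-correction factor from the Adam update rule
        # (beta_1=0.9, beta_2=0.999 are the Keras defaults).
        beta_1, beta_2 = 0.9, 0.999
        t = K.cast(optimizer.iterations, K.floatx()) + 1
        lr_t = lr * (K.sqrt(1. - K.pow(beta_2, t)) / (1. - K.pow(beta_1, t)))
        print('\nLR: {:.6f}\n'.format(K.eval(lr_t)))

You would register it like any other callback, e.g. model.fit(x, y, epochs=10, callbacks=[AdamLearningRateTracker()]).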
Follow this thread.