In my neural network, I want to use a different learning rate or a different optimizer (e.g. AdaGrad) for each layer. How can I implement this? Thanks.
After you set up the optimizer for the model, each parameter of a link in the model has an `update_rule` attribute (e.g. `AdaGradRule` in this case), which defines how that parameter is updated. Each `update_rule` has its own `hyperparam` attribute, so you can overwrite the `hyperparam` for each parameter in the link. Below is a sample code: