How to reset the learning rate?

I have trained several models, and each time I load a model together with its optimizer, the parameters passed in args do not override the saved optimizer's settings. How do I reset the learning rate?
Another question: the perplexity is stuck at around 60, and with a decay factor of 0.5 the learning rate becomes extremely small after several epochs. Tuning the learning rate automatically or by hand is very hard for me; could someone give me some advice?

See options in the documentation here:
http://opennmt.net/OpenNMT/options/train/#optimization-options
:wink:
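
For reference, outside of OpenNMT's own options: a restored PyTorch optimizer keeps the learning rate that was saved with it, so to reset it you have to override it on every parameter group after loading the state dict. A minimal sketch, assuming a hypothetical checkpoint file and key name:

```python
import torch

# Hypothetical model and checkpoint path, for illustration only.
model = torch.nn.Linear(10, 10)
checkpoint = torch.load("model_checkpoint.pt")

optimizer = torch.optim.SGD(model.parameters(), lr=1.0)
# Restoring the state dict also restores the old learning rate.
optimizer.load_state_dict(checkpoint["optimizer"])

# Override the learning rate explicitly on each parameter group.
new_lr = 0.5
for group in optimizer.param_groups:
    group["lr"] = new_lr
```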

The PyTorch version has limited support for retraining. If you are just using the project, I recommend using the Lua version, which has more documented features.

Thank you, I'll have a look at the Lua version.