Learning rate not decaying when perplexity stops decreasing on validation set

I am using the following relevant settings:

```
-learning_rate 1.0
-learning_rate_decay 0.5
-start_decay_steps 10000
```

According to the documentation, the learning rate will be decayed if (i) perplexity does not decrease on the validation set, or (ii) training has gone past start_decay_steps. Option (ii) does seem to work. However, I print the validation perplexity every 1000 steps and notice a (sharp) increase, yet the learning rate does not decrease. How do I fix this?
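In other words, I would expect the documented rule to behave roughly like this (a minimal sketch of my reading of the docs; the function name and signature are mine, not OpenNMT's):

```python
# Sketch of the documented decay rule, not the actual OpenNMT code.
def maybe_decay(lr, step, val_ppl, prev_val_ppl,
                start_decay_steps=10000, learning_rate_decay=0.5):
    """Halve the LR if (i) validation perplexity stopped decreasing
    or (ii) training has passed start_decay_steps."""
    ppl_stalled = prev_val_ppl is not None and val_ppl >= prev_val_ppl  # (i)
    past_start = step >= start_decay_steps                              # (ii)
    if ppl_stalled or past_start:
        lr *= learning_rate_decay
    return lr
```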

This is not implemented in OpenNMT-py; the doc refers to OpenNMT-Lua.
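If you need this behavior with PyTorch, `torch.optim.lr_scheduler.ReduceLROnPlateau` implements the same pattern outside of OpenNMT. A minimal sketch (the model and the perplexity value are placeholders, not your actual training loop):

```python
import torch

model = torch.nn.Linear(8, 8)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=1.0)

# Halve the learning rate as soon as the monitored metric
# (here: validation perplexity) stops decreasing.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.5, patience=0)

for step in range(5):
    val_ppl = 100.0  # placeholder: compute real validation perplexity here
    scheduler.step(val_ppl)
    print(optimizer.param_groups[0]['lr'])
```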

Thanks for your reply, Vincent. That explains why I was not able to find anything in the OpenNMT-py code. Confusingly, the options are still listed in the OpenNMT-py documentation (see http://opennmt.net/OpenNMT-py/options/train.html).

I will use OpenNMT-Lua instead; consider my problem solved.

You are correct, I will fix the documentation.
Thanks.

These options are still listed in the OpenNMT-py documentation. Are they now supported in the PyTorch version?

Right, thanks for reporting. Fixed.