OpenNMT Forum

Learning rate not decaying when perplexity stops decreasing on validation set

opennmt-py

(David Stap) #1

I am using the following relevant settings:
-learning_rate 1.0
-learning_rate_decay 0.5
-start_decay_steps 10000

Now, according to the documentation, the learning rate is decayed if (i) perplexity does not decrease on the validation set or (ii) training has gone past start_decay_steps. Option (ii) does seem to work. However, I print the validation perplexity every 1000 steps and notice a (sharp) increase, yet the learning rate is not decreasing. How do I fix this?
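For reference, the full command I run looks roughly like this (the data and model paths are placeholders for my actual setup; -valid_steps 1000 is how I get the validation perplexity printed every 1000 steps):

    python train.py \
        -data data/demo \
        -save_model models/demo-model \
        -learning_rate 1.0 \
        -learning_rate_decay 0.5 \
        -start_decay_steps 10000 \
        -valid_steps 1000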


(Vincent Nguyen) #2

This is not implemented in OpenNMT-py; that part of the documentation refers to OpenNMT-Lua.


(David Stap) #3

Thanks for your reply, Vincent. That explains why I was not able to find anything in the OpenNMT-py code. Confusingly, the options are still listed in the OpenNMT-py documentation (see http://opennmt.net/OpenNMT-py/options/train.html).

I will use OpenNMT-Lua instead; consider my problem solved.
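In case it helps others, for OpenNMT-Lua I plan to run something along these lines (paths are placeholders, and the epoch-based -start_decay_at flag is my understanding of the Lua option, so double-check it against the Lua training options):

    th train.lua \
        -data data/demo-train.t7 \
        -save_model models/demo-model \
        -learning_rate 1.0 \
        -learning_rate_decay 0.5 \
        -start_decay_at 10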


(Vincent Nguyen) #4

You are correct, I will fix the documentation.
Thanks.


(Jeff Wang) #5

These options are still listed in the OpenNMT-py documentation. Are they now supported by the PyTorch version?


(Vincent Nguyen) #6

Right, thanks for reporting. Fixed.