OpenNMT v0.6 release


(Guillaume Klein) #1

This release comes with many new features:

  • New deep bidirectional and pyramidal deep bidirectional encoders
  • New attention variants from Luong et al. (2015): dot, general, and concat, plus the ability to disable attention entirely (previously only general was available)
  • New learning rate decay strategy based on validation score: only decay when the validation perplexity improves by less than a threshold
  • New beam search score normalization by length and coverage from Wu et al. (2016)
  • Ability to change the dropout value and the fixed word embeddings flags when retraining
  • and more…
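To make the decay strategy concrete, here is a minimal Python sketch of the idea (the function and parameter names are hypothetical, not the actual OpenNMT options): the learning rate is decayed only when validation perplexity stops improving by more than a threshold.

```python
def maybe_decay_lr(lr, prev_ppl, ppl, decay=0.7, threshold=0.0):
    """Decay the learning rate only when validation perplexity
    improved by no more than `threshold` since the last epoch.
    Hypothetical names; illustrates the strategy, not the exact API."""
    if prev_ppl is not None and (prev_ppl - ppl) <= threshold:
        lr *= decay
    return lr
```

With this scheme, a model that keeps improving on the validation set trains at a constant rate, and the decay only kicks in once progress stalls.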
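The beam search score normalization follows equation (14) of Wu et al. (2016): the hypothesis log-probability is divided by a length penalty, and a coverage penalty rewards hypotheses whose attention covers the whole source sentence. A self-contained Python sketch (the function names here are mine, not OpenNMT's):

```python
import math

def length_penalty(length, alpha=0.6):
    # lp(Y) = (5 + |Y|)^alpha / (5 + 1)^alpha   (Wu et al., 2016)
    return ((5.0 + length) ** alpha) / ((5.0 + 1.0) ** alpha)

def coverage_penalty(attn, beta=0.2):
    # attn[j][i]: attention paid to source word i at target step j.
    # cp = beta * sum_i log(min(sum_j attn[j][i], 1.0))
    src_len = len(attn[0])
    total = 0.0
    for i in range(src_len):
        total += math.log(min(sum(step[i] for step in attn), 1.0))
    return beta * total

def normalized_score(log_prob, length, attn, alpha=0.6, beta=0.2):
    # s(Y, X) = log P(Y|X) / lp(Y) + cp(X; Y)
    return log_prob / length_penalty(length, alpha) + coverage_penalty(attn, beta)
```

Without the length penalty, beam search systematically favors shorter hypotheses, since every extra token adds a negative log-probability term; the coverage term additionally penalizes translations that leave source words unattended.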

This release also bundles a complete documentation set that you can browse online here:

http://opennmt.net/OpenNMT/

You will find more details about these new features in their respective sections.

v0.6 also ships the usual fixes and improvements based on user feedback. Thanks to all!


(Etienne Monneret) #2

I’m currently experimenting with this. In the previous version, if I used train_from with a new dropout parameter, was it not taken into account?


(Guillaume Klein) #3

Nope. The dropout value is saved in the static computation graph.

Now we detect that the value has changed and update all nn.Dropout modules within the graph accordingly.
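In other words, the retraining code traverses the saved module graph and rewrites the rate of every dropout layer it finds. A toy Python analogue (OpenNMT itself does this over Torch7's nn.Dropout modules; the classes below are simplified stand-ins):

```python
class Dropout:
    """Stand-in for nn.Dropout: stores its drop rate p."""
    def __init__(self, p):
        self.p = p

class Sequential:
    """Stand-in for a container module holding child modules."""
    def __init__(self, *mods):
        self.mods = list(mods)

    def modules(self):
        # Return self plus all descendants, recursively.
        out = [self]
        for m in self.mods:
            out.extend(m.modules() if hasattr(m, "modules") else [m])
        return out

def set_dropout(net, p):
    # Walk the graph and update every Dropout layer's rate,
    # as OpenNMT does when -dropout changes on retraining.
    for m in net.modules():
        if isinstance(m, Dropout):
            m.p = p
```

The key point is that the rate lives inside the serialized graph, so simply passing a new command-line value is not enough; the modules themselves must be updated.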


(Etienne Monneret) #4

Installing this new release, I’m looking at the code to remove the feature shifting. I found a new shift_feature option in the Features.lua file, but this option seems undocumented. It appears only in Features.lua, and not in the other files that had to change to apply the modification you gave in the past.


(Guillaume Klein) #5

Indeed, it is still incomplete, which is why it is not documented yet. We still need to pass this option through to the translation step to make it complete.