Does OpenNMT-py support explicit training of attention model using alignments?

Hi All,

I am wondering if OpenNMT-py supports explicit training of the attention model using word alignments. I couldn't find anything relevant in the OpenNMT-py documentation. However, on this page
the guided alignment paper is given as a reference for additionally implemented options. So, briefly, my question is: how can I use this guided alignment option in OpenNMT-py?



Unfortunately, OpenNMT-py does not support this feature; that list is a bit misleading.

However, if you are looking for a project that supports this, take a look at seq2seq-attn.
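For background on what the feature would do: guided alignment training adds an extra loss term that penalizes attention weights which deviate from given word alignments (e.g. produced by an external aligner), and this term is added to the usual translation loss. Here is a minimal numpy sketch of such a cross-entropy alignment loss — the function name and normalization details are illustrative, not part of any OpenNMT API:

```python
import numpy as np

def guided_alignment_loss(attn, align, eps=1e-8):
    """Cross-entropy between attention weights and reference alignments.

    attn:  (tgt_len, src_len) attention distribution, one row per target
           word (each row sums to 1).
    align: (tgt_len, src_len) 0/1 reference word alignments, e.g. from an
           external word aligner.
    Rows of `align` are normalized so each target word's reference
    alignment is itself a distribution over source words.
    """
    row_sums = align.sum(axis=1, keepdims=True)
    ref = np.where(row_sums > 0, align / np.maximum(row_sums, eps), 0.0)
    # Penalize attention mass placed away from the reference alignment,
    # averaged over target positions.
    return -np.sum(ref * np.log(attn + eps)) / attn.shape[0]
```

During training this term would be weighted and summed with the regular cross-entropy translation loss, so the attention heads are nudged toward the external alignments while the model still optimizes translation quality.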
