Question about beam search coverage penalty in OpenNMT-tf

I have run some tests on OpenNMT-tf and noticed that beam search tends to generate very long and repetitive translations when the coverage penalty is turned on. When I turned it off, the output was fine. So I suspect there may be a bug in the coverage penalty code? Just a guess.

Unfortunately I can't provide code or generation samples for this finding, sorry about that. :frowning:

OpenNMT-tf version is 1.25.3.

What is the coverage penalty value that you used?

I can recheck the implementation, but it was inspired by the beam search code in TensorFlow, which is most likely correct.

EDIT: On second thoughts, are you using a Transformer model that was not trained with guided alignment? If so, the coverage penalty will not work with this configuration: the penalty scores hypotheses using the attention weights as source-target alignments, and Transformer attention only behaves like an alignment when the model is trained with guided alignment.
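
For reference, below is a minimal sketch of the GNMT-style coverage penalty (Wu et al., 2016), assuming that is the formulation in use here; the function name, `beta` parameter, and tensor shapes are illustrative rather than the actual OpenNMT-tf code:

```python
import tensorflow as tf

def gnmt_coverage_penalty(attention, beta):
    """Sketch of the GNMT-style coverage penalty (Wu et al., 2016).

    attention: attention weights of shape
      [batch, target_length, source_length].
    beta: penalty weight (0 disables the penalty).
    """
    # Total attention mass each source token received across all
    # decoding steps.
    coverage = tf.reduce_sum(attention, axis=1)  # [batch, source_length]
    # A source token counts as "covered" once its mass reaches 1;
    # capping at 1 makes fully covered tokens contribute log(1) = 0,
    # while under-attended tokens contribute negative values.
    log_coverage = tf.math.log(tf.minimum(coverage, 1.0))
    # Returned per-hypothesis penalty, added to the beam score.
    return beta * tf.reduce_sum(log_coverage, axis=1)
```

If the attention tensor does not encode real alignments (e.g. averaged Transformer heads without guided alignment), `coverage` carries little information about which source words were actually translated, and the resulting scores could steer beam search toward the long, repetitive hypotheses described above.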

I did not train with guided alignment, so that must be the cause. Thank you!