Is coverage loss supported?

I see that one of the train options is to use a coverage loss (lambda_coverage), based on the See et al. (2017) paper. However, I cannot find where it is implemented, and I see no change in my training when I use this option. I am using the transformer model.
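
For reference, in See et al. (2017) the coverage term is added to the main negative log-likelihood, scaled by a hyperparameter λ, which is what I assume lambda_coverage corresponds to:

loss_t = -log P(w*_t) + λ · Σ_i min(a_i^t, c_i^t)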
Is it currently supported?

@pltrdy Could you comment on that? Thanks.

My guess is that it is not supported (I haven't located any use of lambda_coverage in the code).
I ended up using the coverage_penalty option of the translate function (though I am not sure it is meant to do the same thing) as well as the block_ngram_repeat option, but neither helped with my data.
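
For the record, coverage_penalty seems to be a beam-search rescoring term in the style of Wu et al. (2016) rather than a training loss. A minimal sketch of what I understand the "wu" variant to compute (the helper name and signature here are mine, not OpenNMT-py's API):

```python
import torch

def coverage_penalty_wu(cov: torch.Tensor, beta: float = 0.2) -> torch.Tensor:
    """Sketch of the Wu et al. (2016) coverage penalty (hypothetical helper).

    cov: (batch, src_len) attention mass accumulated by each source
    position over the decoding steps so far.
    """
    # cp = beta * sum_i log(min(cov_i, 1.0)); log of a value <= 1 is <= 0,
    # so hypotheses that leave source positions uncovered are penalized
    # when this term is added to the beam score.
    return beta * torch.clamp(cov, max=1.0).log().sum(dim=-1)

cov = torch.rand(2, 7)  # toy accumulated attentions
print(coverage_penalty_wu(cov))
```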

Thank you anyway!

Hi @elen, sorry for the late reply (I'm more used to answering GitHub issues, though).

The coverage loss is actually implemented, see e.g. https://github.com/OpenNMT/OpenNMT-py/blob/master/onmt/utils/loss.py#L305, which is supposed to implement the See (2017) covloss, i.e. covloss_t = Σ_i min(a_i^t, c_i^t), where c^t = Σ_{t'<t} a^{t'} is the accumulated attention over previous decoder steps.
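
If it helps, the formula boils down to something like this in plain PyTorch (a minimal sketch of the paper's equation, not the exact code behind that link):

```python
import torch

def coverage_loss(attns: torch.Tensor) -> torch.Tensor:
    """Minimal sketch of See et al. (2017)'s covloss.

    attns: (tgt_len, batch, src_len), the attention distribution
    produced at each decoder step.
    """
    coverage = torch.zeros_like(attns[0])  # c^0 = 0
    loss = attns.new_zeros(())
    for attn in attns:  # iterate over decoder steps
        # covloss_t = sum_i min(a_i^t, c_i^t): penalize putting attention
        # on source positions that previous steps already attended to
        loss = loss + torch.min(attn, coverage).sum()
        coverage = coverage + attn  # c^{t+1} = c^t + a^t
    return loss
```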

My bad, I was using an older version. Thank you very much.