Transformer: change dropout when fine-tuning

Hi,

I need to change the dropout when fine-tuning a base Transformer model. When I resume training from a checkpoint, the model uses the dropout values stored in that checkpoint. Is there a way to force the dropout to be changed?

Thanks

I considered adding the following lines in train_single.py:

# Override checkpoint's dropout
model_opt.dropout = opt.dropout
model_opt.attention_dropout = opt.attention_dropout

This seems like a good idea, but I wonder whether we should also add some sort of flag, e.g. --override-opts, to avoid changing some opts inadvertently (see the sketch below).
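
For illustration, here is a minimal sketch of how such a guard could look; the flag name --override_opts and the helper function are hypothetical and not OpenNMT-py's actual API, but they show the intended behaviour:

import argparse

def add_override_flag(parser: argparse.ArgumentParser) -> None:
    # Hypothetical flag: command-line values only replace the checkpoint's
    # opts when the user explicitly asks for it.
    parser.add_argument(
        "--override_opts", action="store_true",
        help="Override selected checkpoint opts (e.g. dropout) with the "
             "values given on the command line.")

def maybe_override_opts(model_opt, opt):
    # Only touch the checkpoint's model_opt when --override_opts is set,
    # so other opts are never changed inadvertently.
    if getattr(opt, "override_opts", False):
        model_opt.dropout = opt.dropout
        model_opt.attention_dropout = opt.attention_dropout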

Definitely

Does OpenNMT-py support fine-tuning?