Question about the implementation of copy mechanism

pytorch

(Rylanchiu) #1

During decoding, there is a collapse_copy_scores step (here) that merges the copy and generation probabilities. But during training I can't find anything like it: the copy and generation probabilities seem to be concatenated with no post-processing. Could you please tell me why?


(Rylanchiu) #2

It seems I misunderstood the usage of collapse_copy_scores. It shifts the copy probability from src_vocab indices to tgt_vocab indices, so it only matters when src and tgt have different vocabs. But doesn't OpenNMT require share_vocab when copy_attn is enabled? I am quite confused now…
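For what it's worth, here is a minimal sketch of what I understand the "shifting" to mean. This is not the actual OpenNMT-py code; the function name, the flat-list layout of the scores, and the `src_to_tgt` mapping are all my own assumptions, just to illustrate the idea of folding copy probability mass for in-vocab source words back onto their target-vocab slots:

```python
def collapse_copy_scores_sketch(scores, tgt_vocab_size, src_to_tgt):
    """Hypothetical sketch, NOT the real OpenNMT-py implementation.

    scores: probabilities over the target vocab followed by copy
        probabilities over source positions (length
        tgt_vocab_size + n_src_slots).
    src_to_tgt: maps a source slot index to the tgt-vocab id of the
        same word, for source words the target vocab already covers.
    """
    out = list(scores)
    for src_slot, tgt_id in src_to_tgt.items():
        # fold the copy probability for this source word onto its
        # target-vocab slot, then zero out the copy slot
        out[tgt_id] += out[tgt_vocab_size + src_slot]
        out[tgt_vocab_size + src_slot] = 0.0
    return out

# toy example: tgt vocab of 3 words, 2 source positions; source
# position 0 holds a word that is also tgt-vocab id 1
scores = [0.2, 0.25, 0.1, 0.25, 0.2]
merged = collapse_copy_scores_sketch(scores, 3, {0: 1})
# merged -> [0.2, 0.5, 0.1, 0.0, 0.2]
```

If that reading is right, then with a fully shared vocab every source word would collapse onto an existing slot, which is why I don't see what the step buys us when share_vocab is on.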