Multi-Source Translation


(Guillaume Klein) #21

I just added the multi-source Transformer architecture to OpenNMT-tf, starting with "serial" attention layers. It only requires defining a Transformer model with parallel inputs:
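A minimal sketch of such a model definition, assuming OpenNMT-tf's `opennmt.models.Transformer` constructor together with the `ParallelInputter` and `WordEmbedder` inputters (hyperparameter values here are illustrative, and parameter names may vary between OpenNMT-tf versions):

```python
import opennmt as onmt

# Multi-source Transformer: wrap one WordEmbedder per source
# in a ParallelInputter so the model reads two input streams.
model = onmt.models.Transformer(
    source_inputter=onmt.inputters.ParallelInputter([
        onmt.inputters.WordEmbedder(embedding_size=512),
        onmt.inputters.WordEmbedder(embedding_size=512)]),
    target_inputter=onmt.inputters.WordEmbedder(embedding_size=512),
    num_layers=6,
    num_units=512,
    num_heads=8,
    ffn_inner_dim=2048)
```

With "serial" attention, each decoder layer attends to the encoder outputs of the sources one after the other rather than in parallel.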