Multi-way training

I am working with two languages, let's call them A and B. I want to create a single model that can accept input in either language and produce the corresponding translated output, i.e., if my input is in language A the output should be in language B, and vice versa, using PyTorch.

I have read about the ROMANCE model but couldn't understand it, since I'm kind of new to machine translation. Can someone please help me?

Hello, you can just mix A-B and B-A sentence pairs and it should work reasonably well: your model will learn both to detect the source language and to translate into the other one. However, if languages A and B are similar, you can expect confusion between them, so it is better to add a marker in the source side. In an A-B sentence pair, add a target_B token somewhere (for instance as the first token), and the other way around for the other direction.
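A minimal sketch of that data-preparation step in Python; the `<2A>`/`<2B>` marker tokens and the function name are illustrative choices, not part of any particular library:

```python
def make_bidirectional(pairs_ab):
    """Given (sentence_A, sentence_B) pairs, return a mixed list of
    (source, target) examples covering both directions, with a
    target-language marker prepended to each source sentence."""
    examples = []
    for a, b in pairs_ab:
        examples.append(("<2B> " + a, b))  # A -> B direction
        examples.append(("<2A> " + b, a))  # B -> A direction
    return examples

pairs = [("hello world", "bonjour le monde")]
for src, tgt in make_bidirectional(pairs):
    print(src, "=>", tgt)
```

The resulting mixed list can then be shuffled and fed to a standard sequence-to-sequence training loop; the marker token just needs to be added to the source vocabulary like any other token.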


Thanks for the reply…

But I'm kind of new to neural networks… Would you kindly point me to some guidelines (a PDF, perhaps) describing the whole process of multilingual translation?

I've tried to read up on ROMANCE but I only get more confused…

Thanks in advance.

@jean.senellart, is it essential to perform BPE tokenization when working on bi-directional machine translation?