Bi-directional language model


(LM) #1

Hi,

I have trained my model with English -> Chinese.

It seems like the model automatically assumes my source content is in English. Can I use the same trained model to handle Chinese -> English translation? If yes, how can I do that?

Thanks!


(Eva) #2

Hi Lily,

Your English-Chinese model assumes your source content is in English
because its source-side training data was in English.

If you want to handle Chinese-English, you should train a model with Chinese data on the source side and English data on the target side.
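As a minimal sketch (file names like "train.en" and "train.zh" are just placeholders for your own parallel data, one sentence per line), the same corpus can serve both directions; only the source/target roles change:

```python
# Minimal sketch: the same English-Chinese parallel corpus can serve both
# translation directions; only the source/target roles are swapped.
# File names here are placeholders, not a required convention.

corpora = {
    # direction              (source file, target file)
    "English -> Chinese": ("train.en", "train.zh"),
    "Chinese -> English": ("train.zh", "train.en"),
}

for direction, (src, tgt) in corpora.items():
    print(f"{direction}: train a separate model with source={src}, target={tgt}")
```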

Eva


(LM) #3

Thank you Eva.

I thought there might be a way to train the model so it can handle both directions.


(Eva) #4

NMT models learn from the data you provide to them during training.
If you want them to learn to translate both Chinese-English and English-Chinese, you should
provide Chinese and English data on both the source and target sides, so that you obtain a multilingual model.
However, you should introduce a language token to help the system distinguish between languages. You can find an easy-to-follow tutorial here:
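
As a rough illustration of the language-token idea (this is not taken from the tutorial; the file names and the exact token format such as "<2en>" / "<2zh>" are assumptions), you prepend a tag indicating the target language to every source sentence when building the combined training corpus, and use the same tag at inference time:

```python
# Minimal sketch: tag a bidirectional training corpus with target-language
# tokens (e.g. "<2en>", "<2zh>") so one model can learn both directions.
# File names and the token format are assumptions, not a fixed convention.

def tag_corpus(src_path, tgt_path, lang_token, out_src, out_tgt, append=False):
    """Prepend `lang_token` to every source line and copy target lines as-is."""
    mode = "a" if append else "w"
    with open(src_path, encoding="utf-8") as src_in, \
         open(tgt_path, encoding="utf-8") as tgt_in, \
         open(out_src, mode, encoding="utf-8") as src_out, \
         open(out_tgt, mode, encoding="utf-8") as tgt_out:
        for src_line, tgt_line in zip(src_in, tgt_in):
            src_out.write(f"{lang_token} {src_line.strip()}\n")
            tgt_out.write(tgt_line)

# English -> Chinese pairs: English source lines are tagged with "<2zh>".
tag_corpus("train.en", "train.zh", "<2zh>", "train.multi.src", "train.multi.tgt")

# Chinese -> English pairs: Chinese source lines are tagged with "<2en>" and
# appended to the same combined files, so one model sees both directions.
tag_corpus("train.zh", "train.en", "<2en>", "train.multi.src", "train.multi.tgt", append=True)
```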

Good luck!