I have a question about semantics.
For NMT, if the source and the target share some vocabulary during training, is that helpful or not?
For example, reversing the input sentence:
This is a pen. → これ は 一つ ぺん
pen a is This. → これ は 一つ ぺん
‘This’ and ‘これ’ have the same meaning, and reversing the input shortens the distance between them, so reversing the input sentence is a trick used in Google’s paper, and it makes sense.
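A toy sketch of the distance argument (not a real NMT model; the alignment ‘This’ ↔ ‘これ’ is assumed from the example above): with the source reversed, the source word aligned to the first target word ends up right next to where the decoder starts, so the dependency the model must carry is much shorter.

```python
src = ["This", "is", "a", "pen", "."]
tgt = ["これ", "は", "一つ", "ぺん"]

def gap_to_decoder_start(source, word, reverse=False):
    """Number of positions between `word` in the (possibly reversed)
    source and the start of the target-side decoding, assuming the
    decoder begins immediately after the last source token."""
    seq = list(reversed(source)) if reverse else list(source)
    return len(seq) - seq.index(word)

# 'This' aligns with the first target word 'これ'.
print(gap_to_decoder_start(src, "This"))               # → 5 (normal order)
print(gap_to_decoder_start(src, "This", reverse=True)) # → 1 (reversed order)
```

So reversal moves the earliest source–target pair close together, at the cost of lengthening the distance for the last pair; on average the short-range dependencies at the start seem to help the encoder–decoder get started.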
If the source and the target share some vocabulary, could those shared words similarly reduce the distance in translation?