I am trying to use the pre-trained models from OpenNMT, but the translation quality is very poor.
http://opennmt.net/Models-py/
Here is my code:
perl tools/tokenizer.perl -a -no-escape -l en -q < sample_sentences.txt > sample_sentences.atok
python translate.py -gpu 0 -model available_models/averaged-10-epoch.pt -src sample_sentences.atok -verbose -output sample_sentences.de.atok
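Here sample_sentences.txt just contains plain English sentences, one per line:

The cat sat on the mat
Hello, how are you?
How many horses are there in the stable?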
Here are the German translations I get:

Input: The cat sat on the mat
Output: ▁The cat ?
Input: Hello, how are you?
Output: ▁Nein , ▁viel ▁mehr ! ("No, much more!")
Input: How many horses are there in the stable?
Output: ▁Ganz ▁einfach . ("Quite simple.")
I even tried some training sentences from WMT, such as:

Input: I declare resumed the session of the European Parliament adjourned on Friday 17 December 1999, and I would like once again to wish you a happy new year in the hope that you enjoyed a pleasant festive period.
Output: ▁Ganz ▁einfach ▁nur : ▁Das ▁Parlament ▁hat ▁sich ▁in ▁seine m ▁ganz en ▁Haus ▁versteckt . ("Quite simply: the Parliament has hidden itself in its entire house.")
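To read the output more easily, I join the subword pieces back into plain text with a small script of my own (assuming the ▁ characters are SentencePiece-style word-boundary markers, U+2581; detok.py is just my own helper, not part of OpenNMT):

# detok.py -- minimal sketch: rebuild words from SentencePiece-style pieces.
# Assumption: "▁" (U+2581) marks the start of a new word.
import sys

for line in sys.stdin:
    pieces = line.split()
    # Concatenate the pieces, then turn each "▁" back into a space.
    print("".join(pieces).replace("\u2581", " ").strip())

Run as:

python detok.py < sample_sentences.de.atok

Even after detokenizing, the outputs clearly have nothing to do with the inputs.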
Please enlighten me as to where I am going wrong. The model is reported to have a decent BLEU score of >25.