Long-sentence issue in onmt_server

I suspect my translation model does not handle long sentences properly: long inputs come out interrupted, or the predictions have low accuracy.
So I searched through onmt/bin/translate/* and found the max_length argument of the Translator() class in onmt/bin/translate/translator.py.
max_length limits the output tokens, right? But its default value is 100, and my failing sentences are shorter than 100 tokens, so that doesn't explain it.
In the end I couldn't solve this problem.
Maybe long sentences were just not trained well?
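For what it's worth, the output cap you found can be raised at inference time. A hedged sketch of a legacy-style invocation (the file names model.pt, src.txt, and pred.txt are placeholders, not from this thread):

```shell
# Raise the decoder's output cap from the 100-token default.
# model.pt / src.txt / pred.txt are hypothetical file names.
onmt_translate -model model.pt -src src.txt -output pred.txt -max_length 200
```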

Are there any arguments in onmt_preprocessing or onmt_train that need to be changed?
Any replies would be appreciated.

You probably want to look at src_seq_length and tgt_seq_length, which defaulted to 50 in the legacy version (which you are using, since you mention onmt_preprocessing).
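To illustrate why this matters: as I understand it, legacy preprocessing *drops* sentence pairs whose source or target exceeds those limits rather than truncating them, so long sentences never reach training at all. A minimal sketch of that filtering behavior (the function name `keep_pair` and the whitespace tokenization are my own simplifications, not OpenNMT code):

```python
# Sketch of length filtering as done at preprocessing time:
# pairs over the limit are discarded entirely, not truncated.
SRC_SEQ_LENGTH = 50  # mirrors the legacy src_seq_length default
TGT_SEQ_LENGTH = 50  # mirrors the legacy tgt_seq_length default

def keep_pair(src: str, tgt: str,
              src_max: int = SRC_SEQ_LENGTH,
              tgt_max: int = TGT_SEQ_LENGTH) -> bool:
    """Return True if the sentence pair survives length filtering."""
    return len(src.split()) <= src_max and len(tgt.split()) <= tgt_max

pairs = [
    ("a short source", "a short target"),
    ("w " * 60, "long target"),  # 60 source tokens: dropped at the defaults
]
kept = [p for p in pairs if keep_pair(*p)]
print(len(kept))  # prints 1: the 60-token pair was filtered out
```

So if your corpus has many long sentences, raising these limits (and re-running preprocessing and training) is what exposes the model to them.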


Such a simple solution, thank you as always!
Forgive my stupidity :frowning: