Inferring multiple sentences per request

Dear OpenNMT community,

I’m using OpenNMT-tf and testing my model with the onmt infer script.
I find that even if a line in the feature file contains several sentences, the result contains only one sentence.
Is this because my model is not trained long enough? Or my training data is not big enough? Is it a limitation of the NMT algorithm or the behaviour of OpenNMT-tf?

I do appreciate your help,


If you train your model on single sentences, you should also feed single sentences at test time. OpenNMT-tf does not split sentences for you.
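Since the toolkit does not split input for you, a common approach is to segment each input line into sentences before inference. A minimal sketch of such a pre-processing step, using a naive regex-based splitter (a real pipeline would use a proper sentence segmenter):

```python
import re

def split_sentences(line):
    """Naive splitter: break after ., ! or ? followed by whitespace.
    This is only an illustration; real text needs a proper segmenter."""
    parts = re.split(r"(?<=[.!?])\s+", line.strip())
    return [p for p in parts if p]

# One input line holding several sentences becomes several lines,
# matching the one-sentence-per-line unit the model was trained on.
raw = "Hello world. How are you? Fine, thanks."
for sentence in split_sentences(raw):
    print(sentence)
```

The translated lines can then be re-joined afterwards if the downstream application expects one output per original request.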


Thanks @guillaumekln.
The OpenNMT FAQ (section “What do I need to train an NMT model?”) says the training file should contain one sentence per line.
So is it correct to say that OpenNMT can train on multiple sentences per line and there is no one-sentence-per-line limitation? Or is it just OpenNMT-tf that can currently do this?

More generally, a line in the file is a training unit: it’s simply a list of symbols. The toolkit has no concept of a sentence.
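In other words, the toolkit only sees aligned lines of tokens, whatever those lines contain. A hypothetical sketch of preparing parallel training files (file names and token pairs are made up for illustration):

```python
# Hypothetical parallel data: each pair is one training unit.
# The units could be sentences, phrases, or whole paragraphs;
# the toolkit treats each line as an opaque sequence of symbols.
pairs = [
    ("Hello .", "Bonjour ."),
    ("How are you ?", "Comment allez-vous ?"),
]

with open("src-train.txt", "w") as src, open("tgt-train.txt", "w") as tgt:
    for s, t in pairs:
        src.write(s + "\n")   # line N of the source file...
        tgt.write(t + "\n")   # ...aligns with line N of the target file
```

What matters is only that line N of the source file aligns with line N of the target file.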


Thank you @guillaumekln