PRED No words predicted


I trained two models with the most recent version of OpenNMT-py, EN-DE and EN-ES. But when I translate, the result is always an empty string, and I get the following error message:

“PRED No words predicted”.

I saw this issue before, but it was never answered with a solution.

Does anybody have any idea?

I trained on a Titan X GPU with Adam; all other training parameters were left at their defaults, and I used the default parameters for translation as well.

Thanks in advance.
Kind regards,

You need to give more context, please.

Give us the training command line, the end of the training log, and the translate command line.

Also confirm your Python and PyTorch versions, as well as the OpenNMT-py version (the latest one, presumably).

I am using an Anaconda virtual environment:


Python: 3.6.8
opennmt-py: 0.7.0
pytorch: 1.0.0

Training command:

echo "4.1. Prepare the data..."
python $OPENNMT/ \
    -train_src $DATADIR/ \
    -train_tgt $DATADIR/ \
    -src_vocab $DATADIR/ \
    -tgt_vocab $DATADIR/ \
    -valid_src $DATADIR/ \
    -valid_tgt $DATADIR/ \
    -save_data $DATADIR/ready_to_train

echo "4.2. Train..."
python $OPENNMT/ \
    -data $DATADIR/ready_to_train \
    -gpu_ranks 0 \
    -optim adam \
    -log_file $MODELDIR/train.log \
    -valid_batch_size 12 \
    -save_model $MODELDIR/model

Translation command:

python $OPENNMT/ \
    -model $MODELDIR/ \
    -src $INPUT \
    -output $INPUT.out \
    -gpu $DEVICEID

Can you share the end of the train.log file?

I overwrote that in a subsequent test, so I cannot share the log, sorry.

Regardless, after the training finished I got a validation perplexity and accuracy (the accuracy on EN-DE was 91.4), and the models were saved.


Try using the -fast flag as well as -beam_size 5 in the translate command line.
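For reference, that suggestion would look something like the command below. This is a sketch, not a verified fix: the checkpoint filename is a hypothetical placeholder (OpenNMT-py saves checkpoints under the -save_model prefix), and the environment variables are assumed to be set as in the earlier commands.

```shell
# Hypothetical translate invocation with the suggested flags.
# model_step_100000.pt is a placeholder -- use the actual checkpoint
# saved under $MODELDIR from the training run.
python $OPENNMT/translate.py \
    -model $MODELDIR/model_step_100000.pt \
    -src $INPUT \
    -output $INPUT.out \
    -gpu $DEVICEID \
    -fast \
    -beam_size 5
```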

OK, I will do so and report on results.