OpenNMT torch vs OpenNMT-tf

Hi, everyone. I wonder if someone has done a comparison of the two frameworks, OpenNMT Torch and OpenNMT-tf. I ran some experiments and the results are surprising, because I get about 20 points of difference between them. The task is grammatical error correction, so I basically check whether the corrected sentence (hypothesis) is identical to the target sentence. My metric is the percentage of identical sentences over the total number of sentences (a sketch of this metric follows the configuration below). I get a much better score with Torch using BPE and the following configuration:
-rnn_size 128
-encoder_type rnn
-rnn_type LSTM
-end_epoch 60
-max_batch_size 50
-save_model ${folder}/models/
-layers 2
-dropout 0.3
-optim adam
-learning_rate 0.0002
-learning_rate_decay 1.0
-src_word_vec_size 128
-tgt_word_vec_size 128

than with OpenNMT-tf using BPE (I tried different predefined models such as NMTMedium and the Transformer with their default parameters, varying only the number of training updates).
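For reference, here is a minimal sketch of the exact-match metric described above, i.e. the percentage of hypothesis sentences identical to the reference; the file names are hypothetical:

# Exact-match rate: percentage of hypothesis lines identical to reference lines.
def exact_match_rate(hyp_path, ref_path):
    with open(hyp_path, encoding="utf-8") as hyp_file, \
         open(ref_path, encoding="utf-8") as ref_file:
        pairs = [(h.strip(), r.strip()) for h, r in zip(hyp_file, ref_file)]
    return 100.0 * sum(h == r for h, r in pairs) / len(pairs)

# Hypothetical file names for the system output and the target sentences.
print(f"{exact_match_rate('hyp.txt', 'tgt.txt'):.2f}% identical sentences")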

Has anyone done similar experiments comparing both backends?

Hi Suzanna, I’m planning to do “scientific” comparisons like yours, but at present I am doing human evaluations, as I know the two languages involved. What I can see is that OpenNMT-tf produces much more natural-sounding, fluent translations than the Torch (Lua)-based toolkit, which I have been using in production. When I do a scientific comparison I will post the results, as this topic interests me.


Hello Terence & Suzana,

Just to let you know that I am also conducting comparisons between OpenNMT LuaTorch and OpenNMT-tf with SentencePiece (SP) and the Transformer model. We are currently still using the former in production and the latter for experiments, with the aim of deploying it in production if it performs better.

What I see so far is an increase of 6+ detokenized BLEU (BLEU-detok) points on the de-fr language pair. Although we are still at an early stage of human evaluation of the output, I can already see a more correct use of terminology…
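In case it helps others reproduce this, detokenized BLEU can be computed with sacrebleu (my tool choice for this sketch, not necessarily what was used above); a minimal example with hypothetical file names:

import sacrebleu

# Read detokenized hypotheses and references (hypothetical file names).
with open("hyp.detok.fr", encoding="utf-8") as f:
    hyps = [line.strip() for line in f]
with open("ref.fr", encoding="utf-8") as f:
    refs = [line.strip() for line in f]

# corpus_bleu takes the hypotheses and a list of reference streams.
bleu = sacrebleu.corpus_bleu(hyps, [refs])
print(f"BLEU-detok: {bleu.score:.2f}")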

Btw, the main reason for preferring OpenNMT-tf to OpenNMT-py is that the former supports vocabulary update and replacement, i.e. injecting new domain-specific tokens into the vocabulary, as also explained by Guillaume in Choosing between OpenNMT, OpenNMT-py, and OpenNMT-tf for Domain Adaptation; this is useful for our use case.
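To illustrate the idea only (this is not OpenNMT-tf’s internal implementation): before asking the toolkit to remap a checkpoint to a new vocabulary, you typically prepare an extended vocabulary file that appends the unseen domain-specific tokens to the existing one. A minimal sketch, with hypothetical file names:

# Append unseen domain-specific tokens to an existing vocabulary file;
# the actual checkpoint/embedding remapping is then left to OpenNMT-tf.
def extend_vocab(old_vocab_path, domain_tokens_path, new_vocab_path):
    with open(old_vocab_path, encoding="utf-8") as f:
        vocab = [line.rstrip("\n") for line in f]
    known = set(vocab)
    new_tokens = []
    with open(domain_tokens_path, encoding="utf-8") as f:
        for line in f:
            token = line.strip()
            if token and token not in known:
                new_tokens.append(token)
                known.add(token)
    with open(new_vocab_path, "w", encoding="utf-8") as f:
        f.write("\n".join(vocab + new_tokens) + "\n")

extend_vocab("src_vocab.txt", "domain_terms.txt", "src_vocab_updated.txt")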

Let’s share our experiences, as we have topics of common interest!

Hi all,
OpenNMT-Lua is deprecated (no longer maintained), and the main point is that there is no Transformer implementation.
It will definitely give you worse results compared to the OpenNMT-tf and OpenNMT-py Transformer models.
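For reference, recent OpenNMT-tf releases expose the Transformer as a predefined model in their Python API. A minimal training sketch, assuming OpenNMT-tf 2.x and hypothetical data paths:

import opennmt

# Predefined base Transformer from the model catalog.
model = opennmt.models.TransformerBase()

# Minimal data configuration (hypothetical paths).
config = {
    "model_dir": "run/",
    "data": {
        "source_vocabulary": "src_vocab.txt",
        "target_vocabulary": "tgt_vocab.txt",
        "train_features_file": "train.src",
        "train_labels_file": "train.tgt",
    },
}

# auto_config=True applies the recommended Transformer hyperparameters.
runner = opennmt.Runner(model, config, auto_config=True)
runner.train()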

Hi Anna, sorry for not responding earlier. I am about to start training a new Turkish-English model with OpenNMT-tf + SP + Transformer. I will then be able to do some comparisons with my earlier LuaTorch Turkish-English model (currently on nmtgateway.com) and share some results.