Hi, everyone. I wonder if anyone has done a comparison of the two frameworks, OpenNMT (Torch) and OpenNMT-tf. I ran some experiments and the results are surprising: I see about a 20-point difference between them. The task is grammatical error correction, so I simply check whether the corrected sentence (the hypothesis) is identical to the target sentence. My metric is the percentage of identical sentences over the total number of sentences. I get a much better score with the Torch version using BPE and the following configuration:
-tgt_word_vec_size 128 \
than with OpenNMT-tf using BPE (I tried different models, such as NMT-medium and the Transformer, with their default parameters, varying only the number of training updates).
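For reference, the exact-match metric described above can be sketched as follows. This is just an illustration of how I compute the score, not code from either framework; the whitespace trimming is an assumption about normalization.

```python
def exact_match_accuracy(hypotheses, targets):
    """Return the percentage of hypothesis sentences identical to the target.

    This mirrors the metric in the post: % of identical sentences over
    the total number of sentences. Trimming whitespace before comparing
    is an assumption, not something the frameworks do for you.
    """
    assert len(hypotheses) == len(targets), "line counts must match"
    matches = sum(
        hyp.strip() == tgt.strip()
        for hyp, tgt in zip(hypotheses, targets)
    )
    return 100.0 * matches / len(targets)


if __name__ == "__main__":
    # Toy example: one of two hypotheses matches its target exactly.
    hyps = ["He goes to school .", "She have a cat ."]
    tgts = ["He goes to school .", "She has a cat ."]
    print(exact_match_accuracy(hyps, tgts))  # → 50.0
```

In practice I run this over the detokenized (post-BPE-merge) output of each system, so that subword segmentation differences do not affect the comparison.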
Has anyone run similar experiments comparing the two backends?