Opts.py files for the published papers

(Zhenfeng Cao) #1

I’m going to test my model against previously published ones (to name a few, RNNSearch [1], Google’s NMT [2], and the Transformer [3]). So, as a first step, I may have to reproduce their results as precisely as possible.

Suppose I’m going to reproduce RNNSearch, and the tokenized data is directly available here [4]; what is left may be just tuning the parameters in the opts.py file. I’m wondering whether there are any officially issued (by OpenNMT) opts.py files for those published works.

If the answer is no, I think it would be helpful for our research to create a repository for this. Can anyone tell me whether this has already been done? (Otherwise, how about we create one together?)

[1] Bahdanau, Dzmitry, Kyunghyun Cho, and Yoshua Bengio. “Neural machine translation by jointly learning to align and translate.” arXiv preprint arXiv:1409.0473 (2014).
[2] Wu, Yonghui, et al. “Google’s neural machine translation system: Bridging the gap between human and machine translation.” arXiv preprint arXiv:1609.08144 (2016).
[3] Vaswani, Ashish, et al. “Attention is all you need.” Advances in Neural Information Processing Systems. 2017.
[4] http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/

(Vincent Nguyen) #2

None of them is exactly reproducible with OpenNMT-py as is.

Read the FAQ in the OpenNMT-py documentation; you will see a config for the Transformer that will give you results close to “Attention Is All You Need”.
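For reference, the FAQ’s Transformer recipe looks roughly like the sketch below. This is a hedged reconstruction, not an official opts.py: the data and model paths are placeholders, the flag names follow the `onmt_train` interface of OpenNMT-py at the time of this thread, and the GPU settings assume a 4-GPU machine — adjust all of these to your setup and check the current FAQ for the authoritative values.

```shell
# Approximate Transformer (base) training command from the OpenNMT-py FAQ.
# data/wmt and models/transformer are placeholder paths, not official names.
onmt_train -data data/wmt -save_model models/transformer \
    -layers 6 -rnn_size 512 -word_vec_size 512 -transformer_ff 2048 -heads 8 \
    -encoder_type transformer -decoder_type transformer -position_encoding \
    -train_steps 200000 -max_generator_batches 2 -dropout 0.1 \
    -batch_size 4096 -batch_type tokens -normalization tokens -accum_count 2 \
    -optim adam -adam_beta2 0.998 -decay_method noam -warmup_steps 8000 \
    -learning_rate 2 -max_grad_norm 0 -param_init 0 -param_init_glorot \
    -label_smoothing 0.1 \
    -world_size 4 -gpu_ranks 0 1 2 3   # assumes 4 GPUs; drop for single-GPU
```

The key points are the Noam learning-rate schedule with warmup, token-based batching with gradient accumulation (to approximate the paper’s large effective batch size), and label smoothing of 0.1, all of which match the settings described in the original paper.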

(Zhenfeng Cao) #3

Thanks. Will try.