Reversing the input for seq2seq?

I’ve seen in the TensorFlow seq2seq tutorial that the input sequence fed to the encoder RNN is reversed (e.g., “Hi there END” becomes “END there Hi”). I’ve also read that, in seq2seq models, the end of the input sequence is naturally given more weight.

Does OpenNMT have the same bias, or does it already reverse the input? Or should I reverse the input sequence manually to make sure that the end of the sequence is given more weight?

This shouldn’t matter in the OpenNMT model because it uses attention: the decoder can look at every encoder state directly, so the last source tokens don’t need to sit closest to the decoder. The reversal trick mainly helps plain encoder–decoder models without attention.

If you wanted to test it, though, you could just write a simple script to reverse your source files before calling preprocess.lua.
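For example, a minimal sketch in Python (the filenames in the usage comment are just placeholders) that reverses the token order on every line of a tokenized source file:

```python
#!/usr/bin/env python
# Reverse the token order on each line of a tokenized source file.
# Usage: python reverse_tokens.py < src-train.txt > src-train.reversed.txt
import sys

for line in sys.stdin:
    tokens = line.split()
    print(" ".join(reversed(tokens)))
```

Run it on both your training and validation source files (not the targets) and then point preprocess.lua at the reversed files.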