How do I use ConvS2S and RNN+attention in OpenNMT-py?

So I now know that for the Transformer we have to set `-encoder_type transformer -decoder_type transformer`.

What about ConvS2S and RNN+attention? Are they `-encoder_type cnn -decoder_type cnn` and `-encoder_type rnn -decoder_type rnn` respectively?

Also, for RNN+attention, is the attention model on by default, or is there an option to enable attention for the RNN?

Thanks!

For ConvS2S, you can use the following command with your own dataset:
```
python train.py -data iwslt14.tokenized.de-en/germanToEnglish -save_model cnn-model \
    -encoder_type cnn -decoder_type cnn -world_size 1 -gpu_ranks 0 \
    -report_every 4 -batch_size 16 -dropout 0.1 -learning_rate 0.001 \
    -max_generator_batches 16 -valid_batch_size 16 -train_steps 2000000 \
    -enc_layers 5 -dec_layers 5 -src_word_vec_size 512 -tgt_word_vec_size 512 \
    -rnn_size 512 -optim adam -log_file log_20_nov.txt \
    -reset_optim keep_states -learning_rate_decay 0.99
```
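As background on what `-encoder_type cnn` builds: a ConvS2S encoder layer is a 1-D convolution whose output is split in two and combined through a gated linear unit (GLU), with a residual connection back to the input. A toy NumPy sketch of one such layer (shapes and names are illustrative only, not OpenNMT-py internals):

```python
import numpy as np

def glu_conv_layer(x, W, b, k=3):
    """One ConvS2S-style layer (toy): 1-D conv producing 2*d channels,
    split into (a, g), gated output a * sigmoid(g), plus residual.
    x: (T, d) input sequence; W: (k, d, 2*d) conv filters; b: (2*d,)."""
    T, d = x.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))       # pad so output length == T
    out = np.zeros((T, 2 * d))
    for t in range(T):
        window = xp[t:t + k]                   # (k, d) slice of the sequence
        out[t] = np.einsum('kd,kde->e', window, W) + b
    a, g = out[:, :d], out[:, d:]
    y = a * (1.0 / (1.0 + np.exp(-g)))         # gated linear unit
    return y + x                               # residual connection

# Toy usage: sequence of length 5, model dim 4
rng = np.random.default_rng(0)
x = rng.standard_normal((5, 4))
W = rng.standard_normal((3, 4, 8)) * 0.1
b = np.zeros(8)
y = glu_conv_layer(x, W, b)
print(y.shape)  # (5, 4): same shape as the input, so layers can stack
```

Stacking several of these layers (here, `-enc_layers 5 -dec_layers 5`) gives each position a progressively wider receptive field over the source sentence.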

Regarding RNN+attention, you can run the command from the quickstart:

```
python train.py -data data/demo -save_model demo-model
```

The RNN decoder uses global attention by default; the `-global_attention` option selects the variant.

Ah okay, it looks like Luong attention. Thanks!
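For anyone curious what "Luong attention" means concretely: the default `general` variant scores each encoder state against the decoder state through a learned matrix, softmaxes the scores, and takes a weighted sum. A minimal NumPy sketch (shapes and names are illustrative, not OpenNMT-py internals):

```python
import numpy as np

def luong_general_attention(h_t, h_s, W):
    """Luong 'general' (multiplicative) attention.
    h_t: decoder state, shape (d,)
    h_s: encoder states, shape (T, d)
    W:   learned weight matrix, shape (d, d)
    Returns (context vector, attention weights)."""
    scores = h_s @ (W @ h_t)              # score(h_t, h_s) = h_s W h_t, shape (T,)
    e = np.exp(scores - scores.max())     # numerically stable softmax
    weights = e / e.sum()
    context = weights @ h_s               # weighted sum of encoder states, (d,)
    return context, weights

# Toy usage: 3 source positions, model dim 4
rng = np.random.default_rng(0)
h_t = rng.standard_normal(4)
h_s = rng.standard_normal((3, 4))
W = np.eye(4)                             # with W = I this reduces to 'dot' attention
ctx, w = luong_general_attention(h_t, h_s, W)
print(ctx.shape, round(w.sum(), 6))       # (4,) 1.0
```

The `dot` and `mlp` choices swap in a plain dot product or Bahdanau-style additive scoring, respectively.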

btw, why do I need the `-rnn_size` option for ConvS2S?