About the Development category (1)
New `hook` mechanism (16)
Typo on master? (1)
What's "OpenNMT-py/onmt/modules/"? (1)
How do you use '-replace_unk'? (1)
Multiple tokens in Source to single token in Target (10)
Why Lua/Torch? (Please don't hate me for this question.) (9)
What's the use of `coverage` in the forward pass for GlobalAttention? (1)
Could I submit a pull request for a tokenization hook for Korean and Japanese? (4)
Language Model scorer and sampler (17)
Why exclude last target from inputs? (2)
OpenNMT-py Using Multiple Encoders (2)
Approach to translating one sentence at a time by writing each sentence into a temp file (3)
HDFS support in OpenNMT-tf (2)
How to exclude numbers and URLs from the vocabulary in translation? (4)
Windows + CUDA working with PyTorch! (2)
Simple OpenNMT-py REST server (1)
Choosing number of epochs for a stacked encoder decoder model (5)
How does computing loss in shards help reduce memory cost? (1)
SentencePiece vs. BPE (1)
Custom Loss Function Criterion (5)
How to ensemble models with OpenNMT (PyTorch)? (2)
How should I choose parameters? (2)
Corpus level TER averaging (3)
How does ensemble decoding work? (4)
OpenNMT: Lua code debugging (2)
Need help understanding copy_attn_force (3)
BPE options handling in learn_bpe.lua and tokenizer.lua (4)
OpenNMT tagger (CUDA -> CPU) release model (3)
Changing the behaviour of `end_epoch` options when used in combination with `train_from` and `continue` options (5)