Multiple translations in OpenNMT (2)
Quality estimation / pred_score (4)
Paragraph vs sentence segmentation (3)
What is the best way to generate a parallel corpus, in your experience? (1)
In a parallel corpus for MT, should each line hold a single sentence or a paragraph? (3)
Metrics (BLEU, ppl, gold ppl, pred, ...) (8)
About masks in the attention layer and decoder (5)
How to create models that translate very quickly (8)
Exposure bias during training (4)
Improve BLEU by Coverage and Context Gate (7)
Use of monolingual corpora in OpenNMT (8)
Using features for domain/client/subject adaptation (7)
Has anyone experimented with the dense bridge? (2)
Has anyone tested RNNs with larger recurrence? (6)
Training chatbot with multiple inputs (to add context) (5)
Corpus compaction (3)
Incremental vocab (7)
Automatic training corpus filtering (16)
Implementing hashing to massively improve performance (by 95%) (1)
Speech-to-text using Convolutional LSTM layers (2)
How sentence length affects the decrease in perplexity (3)
Simple combination between SMT and NMT (1)
Alternative methods for <UNK> substitution (6)
Importance Sampling - training speed (2)
Weird Output after many attempts to train (1)
Noise Contrastive Estimation for Machine Translation (5)
In-domain training (15)
Early stopping: a fake solution? (4)
Uses for beam search in NMT (8)
New NMT Tutorial from Graham (6)
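Several of the threads above discuss metrics such as perplexity (ppl). As a minimal, toolkit-agnostic sketch (not tied to any specific thread's implementation), perplexity is simply the exponential of the mean per-token negative log-likelihood:

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp of the mean per-token negative log-likelihood (natural log)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# Hypothetical NLL values for four target tokens; mean NLL is 1.25.
print(perplexity([1.2, 0.8, 2.0, 1.0]))  # exp(1.25) ≈ 3.49
```

Lower perplexity means the model assigns higher probability to the reference tokens, which is why it is tracked alongside BLEU during training.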