Exposure bias during training
Improving BLEU with Coverage and Context Gate
Use of monolingual corpora in OpenNMT
Using features for domain/client/subject adaptation
Automatic post-edit training of a training
Has anyone experimented with the dense bridge?
Has anyone tested RNNs with larger recurrence?
Training chatbot with multiple inputs (to add context)
Automatic training corpus filtering
Implementing hashing to massively improve performance (by 95%)
Speech-to-text using Convolutional LSTM layers
Sentence length affects perplexity decrease
Simple combination between SMT and NMT
Alternative methods for <UNK> substitution
Importance Sampling - training speed
Weird Output after many attempts to train
Noise Contrastive Estimation for Machine Translation
Early stopping: a fake solution?
Uses for beam search in NMT
New NMT Tutorial from Graham
Differences between different NMT frameworks
Bringing HPC Techniques to Deep Learning - Baidu Research