Some experience when training with large datasets
Bpe, vocab size
OpenNMT-py de-en Train Valid Perplexity
How to package API
What's wrong with my code?
Bpe option in learn_bpe.lua
Using trained OpenNMT models for scoring translations
Restore config from checkpoint
Translate.py failing with TypeError
Issue on Dictionary Formation and translation processes
Evaluate translation hypotheses
Training with two GPUs is slower than using only one
Vocabs in preprocess.py
Perplexity nan at epoch 5
Paraphrase generation with OpenNMT
XML tags handling
Finding perplexity score for existing model
Training stops outputting logs when encoder_type is set to brnn
Shared vocabulary for summarization problem
Tok_(src|tgt)_case_feature + DynData training, translate <unk> rather than L or N
Out Of Memory with Dynamic Dataset, LuaJit, preprocess_nthreads > 1
When does the model transform string to int during training?
Why value of list(train_iter).src in train.py always changing?
Tok_src_case_feature : how to do with translation_server?
Can you suggest a paper/video explaining how the OpenNMT approach works and how it calculates accuracy and other metrics?
Problem in translate.py
Epoch cannot be finished when training twice from a checkpoint model
Training takes up 100% of RAM and swap CPU memory
Chinese-to-English translation produces many unknown words