Hi,
I’ve been trying to generate questions from text (posed as a translation problem) using OpenNMT. I’ve tried various architectures (different numbers of layers, hidden sizes, etc.), experimented with both LSTM and GRU cells, and also varied other things like the optimizer and the GloVe embedding dimensions. No matter what I change, the model produces essentially the same fixed output for any input.
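For context, I set the data up exactly like a standard translation task, with the source file holding the sentences and the target file holding the questions. The preprocessing step was roughly the following (the file names are placeholders for my own files):

th preprocess.lua -train_src data/src-train.txt -train_tgt data/tgt-train.txt \
  -valid_src data/src-val.txt -valid_tgt data/tgt-val.txt \
  -save_data data/qg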
Below are some excerpts of the output.
Text: SOS in 2010 , there were concerns among tajik officials that islamic militarism in the east of the country was on the rise following the escape of 25 militants from a tajik prison in august , an ambush that killed 28 tajik soldiers in the rasht valley in september , and another ambush in the valley in october that killed 30 soldiers , followed by fighting outside UNK that left 3 militants dead . EOS
Question: SOS what was the name of the battle in the battle of the battle of the battle ? EOS
Text: SOS as of 2008 -lsb- update -rsb- tertiary education enrollment was 17 % , significantly below the UNK average of 37 % . EOS
Question: SOS how much of the population was the population of 2010 ? EOS
Text: SOS during the 1970s and 1990s , there was an epistemological shift away from the UNK traditions that had largely informed the discipline . EOS
Question: SOS what was the name of the UNK that was used by the early ? EOS
Text: SOS cognitive anthropology seeks to explain patterns of shared knowledge , cultural innovation , and transmission over time and space using the methods and theories of the cognitive sciences -lrb- especially experimental psychology and evolutionary biology -rrb- often through close collaboration with historians , ethnographers , archaeologists , linguists , musicologists and other specialists engaged in the description and interpretation of cultural forms . EOS
Question: SOS what is the name of the theory of science ? EOS
Translations for text from outside the corpus:
Text: SOS Oxygen is used in cellular respiration and released by photosynthesis, which uses the energy of sunlight to produce oxygen from water. EOS
Question: SOS in what year did gaddafi die ? EOS
Text: SOS This actually comes in handy when using list comprehensions, or sometimes in return statements, otherwise I’m not sure it helps that much in creating readable code. EOS
Question: SOS in what year did napoleon die ? EOS
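In case it matters, the translations above were produced with a plain translate.lua call, along these lines (the checkpoint name is a placeholder for the actual saved model file):

th translate.lua -model model_epoch15_PPL.t7 -src data/src-test.txt -output pred.txt -gpuid 1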
I used the following configuration to get the above results:
th train.lua -data data/qg-train.t7 -save_model model \
  -rnn_size 400 -layers 3 -rnn_type LSTM -dbrnn 1 \
  -attention global -global_attention dot -input_feed 1 \
  -optim adam -learning_rate 0.0002 -learning_rate_decay 0.8 -start_decay_at 8 \
  -max_batch_size 64 -dropout 0.3 -max_grad_norm 5 \
  -start_epoch 1 -end_epoch 15 \
  -word_vec_size 300 \
  -pre_word_vecs_enc data/qg-src-emb-embeddings-300.t7 \
  -pre_word_vecs_dec data/qg-tgt-emb-embeddings-300.t7 \
  -fix_word_vecs_enc 1 -fix_word_vecs_dec 1 \
  -gpuid 1
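The pre-trained embedding files referenced above were converted from GloVe vectors with the bundled tools/embeddings.lua script, roughly like this (I’m writing the flags from memory, so they may be slightly off; the GloVe file and dictionary names are mine):

th tools/embeddings.lua -embed_type glove -embed_file data/glove.6B.300d.txt \
  -dict_file data/qg.src.dict -save_data data/qg-src-emb

and the same again with the target dictionary (data/qg.tgt.dict) and -save_data data/qg-tgt-emb for the decoder side.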
The training set contains around 80 thousand (text, question) pairs. Can anyone please suggest what might be going wrong? I am trying to replicate the results from the paper “Learning to Ask: Neural Question Generation for Reading Comprehension” (Du et al., 2017). Any help would be appreciated.
Thanks!!