Why does Source tokens/s show 1 in every set of iterations?

[06/07/17 10:51:37 INFO] Epoch 1 ; Iteration 50/449663 ; Optim SGD LR 1.0000 ; Source tokens/s 1 ; Perplexity 550610.04
[06/07/17 10:54:13 INFO] Epoch 1 ; Iteration 100/449663 ; Optim SGD LR 1.0000 ; Source tokens/s 1 ; Perplexity 156822.54
[06/07/17 10:57:13 INFO] Epoch 1 ; Iteration 150/449663 ; Optim SGD LR 1.0000 ; Source tokens/s 1 ; Perplexity 31865.05
...
Why is it always 1?

What are your training options?

Hey! The default options, but I've turned off shuffle and sort in preprocessing.

Also, I tried modifying the LSTM code a bit.

I'm new to this; can you please also let me know the significance of tokens per second?

Thank you!

What did you modify exactly?

I think the reporting is broken on your side, but that number just tells you how fast you are processing your training data.
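If it is stuck at 1, the token counter is probably not being updated between reports, which can easily happen after modifying the training code. For intuition, here is a minimal sketch of how such a speed report typically works; the names `num_source_tokens` and `start_time` are illustrative, not actual OpenNMT internals:

```python
import time

def report_speed(num_source_tokens: int, start_time: float) -> float:
    """Tokens/s = source tokens processed since the last report,
    divided by the elapsed wall-clock time."""
    elapsed = time.time() - start_time
    return num_source_tokens / elapsed

# Example: 50 iterations of batch size 64 with ~20 source tokens per
# sentence, over roughly 156 seconds (matching the log timestamps above):
print(round(report_speed(50 * 64 * 20, time.time() - 156)))  # ~410 tokens/s
```

If the accumulator feeding `num_source_tokens` is never incremented, the report degenerates to a constant small value like the 1 you are seeing.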

Also, if you disabled sorting in the preprocessing, you should use -uneven_batches during training; otherwise you will only get very small batches, as the sketch below illustrates.
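Here is a rough illustration of the batching constraint (not the actual OpenNMT batcher): without -uneven_batches, every sentence in a batch must share the same source length, so unsorted data forces a new batch almost every sentence:

```python
def make_batches(lengths, max_batch_size=64, uneven=False):
    """Group consecutive sentence lengths into batches. Without `uneven`,
    a batch is cut as soon as the source length changes."""
    batches, current = [], []
    for n in lengths:
        if current and (len(current) == max_batch_size
                        or (not uneven and n != current[0])):
            batches.append(current)
            current = []
        current.append(n)
    if current:
        batches.append(current)
    return batches

sorted_data = [5] * 64 + [6] * 64         # sorted by source length
unsorted_data = [5, 9, 6, 12, 5, 7] * 20  # alternating lengths, no sorting

print([len(b) for b in make_batches(sorted_data)])                 # [64, 64]
print([len(b) for b in make_batches(unsorted_data)])               # 120 batches of size 1
print([len(b) for b in make_batches(unsorted_data, uneven=True)])  # [64, 56]
```

With -uneven_batches, batches of mixed source lengths are allowed, so you get full-size batches again at the cost of some extra padding.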