guz18
(Göksu g)
April 20, 2022, 11:36am
1
Hi everyone,
I am trying to plot the loss across epochs using onmt.utils.loss, but I am new to Deep Learning and could not find a guide or tutorial on how to use it. Can you help me with this? (My research topic is Grammar Error Correction for the Turkish language.)
ymoslem
(Yasmin Moslem)
April 23, 2022, 4:27am
2
Dear Göksu,
OpenNMT reports Perplexity. See this nice explanation by Paul Tardy:
Answer: Interesting question.
First, I wondered about the same question some months ago. So I think I know exactly the feeling you have, like people in ML/NLP use some variables and apply some transformations (functions) which you can read but...
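To make the connection concrete: perplexity is just the exponential of the average per-token cross-entropy loss, so the two are interchangeable. Here is a minimal sketch (the loss value is a made-up example, not an actual OpenNMT log value):

```python
import math

def perplexity(avg_nll: float) -> float:
    """Convert an average per-token cross-entropy loss (in nats) to perplexity."""
    return math.exp(avg_nll)

# e.g. an average cross-entropy of 2.0 nats/token corresponds to:
print(round(perplexity(2.0), 2))  # 7.39
```

So a falling perplexity curve in the training log carries the same information as a falling loss curve.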
As for how to calculate epochs, see this answer:
You also need to take into account the number of GPUs you use and any gradient accumulation (-accum_count) to get your ‘true’ batch size.
E.g. with -batch_size 512, 2 GPUs, and -accum_count 3, one step will process 512 * 2 * 3 examples.
You can check this in the log: when the data is loaded again, i.e. a new epoch starts, it will log ‘Loading dataset …’. (This may be slightly off because examples are loaded in advance to improve training speed and GPU utilization.)
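The arithmetic above can be sketched in a few lines. The dataset size below is a hypothetical example value, just to show how steps per epoch fall out of the effective batch size:

```python
import math

# Effective ('true') batch size, as described in the quote above.
batch_size = 512    # -batch_size (here counted in examples; token-based batching differs)
num_gpus = 2
accum_count = 3     # -accum_count (gradient accumulation)

effective_batch = batch_size * num_gpus * accum_count  # examples consumed per step
print(effective_batch)  # 3072, matching the 512 * 2 * 3 example

# With a hypothetical training set of 1,000,000 examples:
dataset_size = 1_000_000
steps_per_epoch = math.ceil(dataset_size / effective_batch)
print(steps_per_epoch)  # 326
```

Dividing your total training steps by this steps-per-epoch figure gives an approximate epoch count.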
See also this item in FAQs; although it is for OpenNMT-tf, it applies to OpenNMT-py, too:
https://opennmt.net/OpenNMT-tf/faq.html#how-to-count-the-number-of-epochs
All the best,
Yasmin