Is there a way to measure inference speed?

tensorflow

(Tony Reina) #1

I’d like to use OpenNMT to benchmark inference speed on NMT models. My thought was to use the English-to-German corpus. Typically I’ve seen inference speed reported as the number of sentences translated per second. Is there a switch in OpenNMT that shows this value in the log, or do I have to measure it some other way?

Thanks.
-Tony


(Guillaume Klein) #2

This feature is coming in the next version (see the changelog). It’s a flag that simply logs several speed metrics at the end of inference.

The next version should be out in the next few days but if you want the feature now, you can clone the repository and run the scripts directly.
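Until that flag is available, one workaround is to time a translation run yourself and divide by the number of source sentences. A minimal sketch, assuming you can wrap your translation call in a Python function (`translate_fn` and `dummy_translate` below are stand-ins for illustration, not actual OpenNMT APIs):

```python
import time

def sentences_per_second(translate_fn, sentences):
    """Time translate_fn over a list of sentences and return sentences/sec."""
    start = time.perf_counter()
    translate_fn(sentences)
    elapsed = time.perf_counter() - start
    return len(sentences) / elapsed

# Example with a dummy translator standing in for a real model call.
def dummy_translate(batch):
    return [s.upper() for s in batch]  # placeholder "translation"

speed = sentences_per_second(dummy_translate, ["ein Satz"] * 1000)
print(f"{speed:.1f} sentences/sec")
```

Note that this measures end-to-end throughput (including any model loading done inside the callable), so for a fair benchmark you would load the model once beforehand and time only the translation loop.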