Inference with OpenNMT-tf on "native" Windows 10

I would like to report that I am performing inference with OpenNMT-tf on Windows 10 without using WSL. I use TensorFlow 2.4 and the latest OpenNMT-tf. I got around the lack of pyonmttok by writing my own tokenizer methods with SentencePiece. My app is an "on device" system using a modification of the Python client published on the OpenNMT GitHub. This work has been done on an Asus Zenbook without a suitable GPU; I assume the same setup would work on a more powerful Windows machine with a GPU.
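For anyone attempting the same, here is a minimal sketch of what SentencePiece-based tokenize/detokenize helpers can look like in place of pyonmttok. The model filename "wmtende.model" and the function names are my own placeholders, not part of OpenNMT-tf; only the sentencepiece calls themselves are standard.

```python
# Sketch: replace pyonmttok with plain SentencePiece on native Windows.
# "wmtende.model" is a placeholder for your trained SentencePiece model.
import sentencepiece as spm

sp = spm.SentencePieceProcessor(model_file="wmtende.model")

def tokenize(text):
    # Encode a source sentence into subword pieces, joined with spaces
    # so it can be fed to the model as pretokenized input.
    return " ".join(sp.encode(text, out_type=str))

def detokenize(tokens):
    # Merge the translated subword pieces back into plain text.
    return sp.decode(tokens.split())
```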
