Trying to run Inference/Translation server

Hello, I am currently trying to run a REST server to do inference and translation, so a user can just send requests as needed. However, when I attempt to follow these instructions:

https://opennmt.net/OpenNMT/tools/servers/

I am met with no results because there are no .lua files in the OpenNMT-py repository that I pulled. In particular, there is no:

tools/rest_translation_server.lua

Here is where I pull the code from: GitHub - OpenNMT/OpenNMT-py: Open Source Neural Machine Translation in PyTorch

Not sure what I am doing wrong here. If this is a foolish question, forgive me. The picture below shows the directory:

[screenshot: "No such file or directory" in the directory listing]

Hi,

This documentation refers to the old OpenNMT/OpenNMT project which is based on LuaTorch.

For OpenNMT-py, please refer to this tutorial:

Yes, I’ve looked at that as well. Do I need to run the server.py file in order to make requests from the server, or is the server good to go once I have exposed the IP/port/configs?

Yes, as stated in point 3 of the tutorial: the server needs to be running in order to listen for requests.
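For reference, once the server is listening, translation requests are POSTed as a JSON list of segments. A minimal sketch of building such a payload, assuming the tutorial's conventions (a `/translator` URL root and a model registered under id 100 in the config are assumptions here, not confirmed for this setup):

```python
import json

# Sketch of the JSON body the OpenNMT-py REST server's
# /<url_root>/translate route expects: a list of segments, each
# carrying the source text and the model id from the config file.
# (url_root "/translator" and model id 100 are assumed values.)
def build_translate_request(texts, model_id=100):
    return [{"src": t, "id": model_id} for t in texts]

payload = build_translate_request(["Hello world"])
print(json.dumps(payload))

# The payload would then be POSTed to the running server, e.g.:
#   requests.post("http://172.17.0.3:2735/translator/translate", json=payload)
```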

Yes, I know it comes off as dumb, but the comment made it seem as if the server.py command might simply be passing the default values to the server without necessarily running it or setting it listening. I like to know precisely what everything is doing so that I can document it accurately.

Thank you for clarifying

Alright, so I followed the OpenNMT-py REST API tutorial and ran into this when I tried to run the server.py command with the appropriate params:

python3 server.py --ip 172.17.0.3 --port 2735 --url_root /translator --config ./available_models/example.conf.json

Traceback (most recent call last):
  File "server.py", line 2, in <module>
    from onmt.bin.server import main
  File "/root/opennmt/OpenNMT-py-translate-server/OpenNMT-py/onmt/__init__.py", line 2, in <module>
    import onmt.inputters
  File "/root/opennmt/OpenNMT-py-translate-server/OpenNMT-py/onmt/inputters/__init__.py", line 6, in <module>
    from onmt.inputters.inputter import get_fields, build_vocab, filter_example
  File "/root/opennmt/OpenNMT-py-translate-server/OpenNMT-py/onmt/inputters/inputter.py", line 109
    raise ValueError(f"No task specific tokens defined for {data_task}")

I am sorry, I don't quite understand: is server.py simply starting the server up so it can listen for requests? If so, am I missing some package? Otherwise, I'm not sure why it would be attempting to run some sort of task/job other than setting up a listening server.

Your error trace does not seem complete.

Did you update the example.conf.json to use your models/paths?
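For comparison, the example config in the tutorial has roughly this shape; the model id, filename, and opt values below are placeholders/illustrative defaults, not values from your setup:

```json
{
  "models_root": "./available_models",
  "models": [
    {
      "id": 100,
      "model": "model_0.pt",
      "timeout": 600,
      "on_timeout": "to_cpu",
      "load": true,
      "opt": {
        "gpu": -1,
        "beam_size": 5
      }
    }
  ]
}
```

The `model` path is resolved relative to `models_root`, so both must point at a checkpoint that actually exists on your machine.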

Also, did you make sure your OpenNMT-py setup was fine, e.g. by trying a simple translation with a pretrained model?
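As a sanity check before involving the server, something along these lines would confirm the base install can translate at all (a sketch: the checkpoint and source file paths are placeholders to be replaced with your own):

```shell
# Placeholder paths: substitute your own checkpoint and source file.
onmt_translate -model available_models/model_0.pt \
               -src src-test.txt \
               -output pred.txt \
               -verbose
```

If this command fails with a similar import error, the problem is in the OpenNMT-py installation itself rather than in the server configuration.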