OpenNMT Forum

Simple OpenNMT-py REST server

Is there a way to use the tokenizer/detokenizer in the tools directory during translation?

How would that be configured in conf.json? Here are the default settings:

"tokenizer": {
    "type": "sentencepiece",
    "model": "wmtenfr.model"

Which tokenizer are you referring to?

The server supports the types sentencepiece and pyonmttok. The latter refers to the OpenNMT Tokenizer:

I suppose @stevebpdx meant Moses’ tokenizer.perl from OpenNMT-py’s tools directory.

OK, so pyonmttok is OpenNMT's out-of-the-box tokenizer.

What’s the difference between sentencepiece and pyonmttok? Can you point me to some documentation?

It can apply SentencePiece and more. See some documentation here:
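For intuition (this is a simplified sketch, not the library's implementation): in pyonmttok's joiner_annotate mode, places where a space was removed are marked with a joiner character, so detokenization just merges any token carrying a joiner back onto its neighbor:

```python
JOINER = "\uffed"  # "￭", pyonmttok's default joiner marker

def detokenize(tokens):
    """Undo joiner-annotated tokenization: a leading joiner glues a token
    to the previous one, a trailing joiner glues it to the next one."""
    out = ""
    glue_next = False
    for tok in tokens:
        glue = glue_next or tok.startswith(JOINER)
        if tok.startswith(JOINER):
            tok = tok[len(JOINER):]
        glue_next = tok.endswith(JOINER)
        if glue_next:
            tok = tok[:-len(JOINER)]
        out += tok if (glue or not out) else " " + tok
    return out

print(detokenize(["Hello", "￭,", "world", "￭!"]))  # -> Hello, world!
```

The same joiner convention is what lets BPE subwords like ["un", "￭believ", "￭able"] be merged back into a single word after translation.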

A post was merged into an existing topic: German to English Translation Server

I have a question.
I used only the tokenizer and BPE provided by the OpenNMT tools as preprocessing.
How can I apply the tokenizer and BPE provided by tools to

How do I change the example below?
"tokenizer": {
    "type": "sentencepiece",
    "model": "wmtenfr.model"
}

When I use the command below:
python3 --ip --port 5000 --url_root "/translator" --config "./available_models/conf.json"

this error occurs:
TypeError: unorderable types: list() < int()

What should I do?

"models_root": "./available_models",
"models": [
    {
        "id": 1000,
        "model": "",
        "timeout": 600,
        "on_timeout": "to_cpu",
        "load": true,
        "opt": {
            "gpu": 0,
            "beam_size": 5
        },
        "tokenizer": {
            "type": "pyonmttok",
            "model": "src.code"
        }
    }
]

Should be something like:

"tokenizer": {
  "type": "pyonmttok",
  "mode": "conservative",
  "params": {

where params can be any arguments from here:
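For example, if the model was trained with BPE codes, the tokenizer entry could look like the following (bpe_model_path and joiner_annotate are pyonmttok Tokenizer arguments; src.code stands for whatever codes file was used at training time):

```json
"tokenizer": {
    "type": "pyonmttok",
    "mode": "conservative",
    "params": {
        "bpe_model_path": "src.code",
        "joiner_annotate": true
    }
}
```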

I have a question about

python3 --ip --port 5000 --url_root "./translator" --config "./available_models/conf.json"

Pre-loading model 100
[2019-06-03 11:17:06,458 INFO] Loading model 100
Traceback (most recent call last):
File "", line 123, in <module>
File "", line 24, in start
File "/data/home/chanjun_park/work/OpenNMT-py/onmt/translate/", line 102, in start
self.preload_model(opt, model_id=model_id, **kwargs)
File "/data/home/chanjun_park/work/OpenNMT-py/onmt/translate/", line 140, in preload_model
model = ServerModel(opt, model_id, **model_kwargs)
File "/data/home/chanjun_park/work/OpenNMT-py/onmt/translate/", line 227, in __init__
File "/data/home/chanjun_park/work/OpenNMT-py/onmt/translate/", line 282, in load
os.devnull, "w", "utf-8"))
File "/data/home/chanjun_park/work/OpenNMT-py/onmt/translate/", line 28, in build_translator
fields, model, model_opt = load_test_model(opt)
File "/data/home/chanjun_park/work/OpenNMT-py/onmt/", line 99, in load_test_model
File "/data/home/chanjun_park/work/OpenNMT-py/onmt/", line 128, in build_base_model
src_emb = build_embeddings(model_opt, src_field)
File "/data/home/chanjun_park/work/OpenNMT-py/onmt/", line 53, in build_embeddings
File "/data/home/chanjun_park/work/OpenNMT-py/onmt/modules/", line 167, in __init__
pe = PositionalEncoding(dropout, self.embedding_size)
File "/data/home/chanjun_park/work/OpenNMT-py/onmt/modules/", line 35, in __init__
self.dropout = nn.Dropout(p=dropout)
File "/data/home/chanjun_park/.local/lib/python3.5/site-packages/torch/nn/modules/", line 11, in __init__
if p < 0 or p > 1:
TypeError: unorderable types: list() < int()

I don't know why this is happening.
Is there any solution?

@francoishernandez Any idea?

The OpenNMT-py server's throughput (TPS) is low. What can I do?

I hope someone can help me. Thanks.

We changed the dropout option from int to list.
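A minimal illustration (simplified, not OpenNMT-py's actual code) of why a list-valued dropout option crashes the old scalar range check seen in the traceback:

```python
# Newer OpenNMT-py stores dropout as a list, while the older server code
# handed it straight to a scalar range check like torch's nn.Dropout does.
dropout = [0.3]                     # new-style option value
try:
    if dropout < 0 or dropout > 1:  # old scalar check, as in the traceback
        raise ValueError("dropout out of range")
except TypeError:
    pass  # Python 3.5 phrased this as: unorderable types: list() < int()

# After updating, the code unwraps the list before building nn.Dropout:
p = dropout[0] if isinstance(dropout, list) else dropout
assert 0 <= p <= 1
```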

I think your model has been trained with the new list format, but you must have exposed it on a server that has not been updated with master.

git pull on your server and let me know.


Thank you so much.
Problem solved.

32 posts were split to a new topic: Issues running the OpenNMT-py REST server


Hello @pltrdy,
while running the server I got the above error. Your help in solving the issue will be appreciated. Thanks.

Hello @park, while starting the server I get the following error, yet my available_models directory is in the path ng@ng:~/OpenNMT-py/. Please help, thanks.
python3 --ip --port 5000 --url_root “./translator” --config “./available_models/conf.json”
Traceback (most recent call last):
File "", line 129, in <module>
File "", line 24, in start
File "/home/ng/OpenNMT-py/onmt/translate/", line 80, in start
with open(self.config_file) as f:
FileNotFoundError: [Errno 2] No such file or directory: '“./available_models/conf.json”'
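One thing worth checking first: the filename in the traceback literally contains typographic (curly) quotes, which happens when a command is pasted with "smart" quotes; the shell then passes them through as part of the --config argument, so Python looks for a path that does not exist. A quick sketch of the mismatch:

```python
# The path Python reports (with curly quotes baked in) is not the path on disk.
reported = "“./available_models/conf.json”"  # what open() was given
intended = reported.strip("“”")              # what was meant
assert intended == "./available_models/conf.json"
print(intended)
```

Retyping the command with plain ASCII double quotes avoids this class of error.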

I think I just figured out where the issue might be; I had not input my trained model. Working on that.