Inference config not working when serving with TensorFlow Serving via nmt-wizard-docker

Export model with config

onmt-main export --export_dir_base /data/models --config config/test.yml

test.yml

infer:
  n_best: 3
  with_scores: true
  with_alignments: hard

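As a sanity check (a hypothetical local run; src.txt and out.txt are placeholder file names), the same options can be verified with offline inference before serving:

onmt-main infer --config config/test.yml --features_file src.txt --predictions_file out.txt

With n_best: 3 and with_scores: true, each input line should produce three scored hypotheses in out.txt.
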
Serving with nmt-wizard-docker

docker run -p 5000:5000 -v $PWD:/root/models nmtwizard/opennmt-tf --model 20190625 --model_storage /root/models serve \
  --host 0.0.0.0 --port 5000
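
Once the container is up, a translation request can be sent to the REST endpoint (a hedged sketch; the /translate route and the src/text request shape follow the nmt-wizard-docker serving API, but double-check against the version you run):

curl -X POST http://localhost:5000/translate \
  -H "Content-Type: application/json" \
  -d '{"src": [{"text": "Hello world"}]}'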

config.json

{
    "source": "en",
    "target": "ko",
    "model": "20190625",
    "modelType": "release",
    "tokenization": {
        "source": {
            "mode": "none",
            "vocabulary": "${MODEL_DIR}/1/assets/clean_train.en.tok.150k.vocab"
        },
        "target": {
            "mode": "none",
            "vocabulary": "${MODEL_DIR}/1/assets/clean_train.ko.tok.150k.vocab"
        }
    },
    "options": {
        "config": {
            "infer": {
                "n_best": 3,
                "with_scores": true,
                "with_alignments": "hard"
            },
            "score": {
                "with_alignments": "hard"
            }
        }
    }
}

Results

But there is no source/target alignment information in the returned results.
Does anyone have an idea about this problem?

Thank you!

Alignment vectors are currently not returned by this wrapper. This will be added in the future.

Thanks for your answer!
Is there any other way to get the alignment vectors?

  1. Using the OpenNMT-py version
  • Does it support alignment vectors?
  2. Using TensorFlow Serving directly
  • In this case, do I have to implement client code to get the alignment vectors?

1. The OpenNMT-py server does not return alignment vectors.
2. Yes, alignment vectors are part of the model outputs and so are returned by TensorFlow Serving.
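
To query TensorFlow Serving directly instead of the wrapper (a minimal sketch, assuming the model is served over TensorFlow Serving's REST API on its default port 8501 and exposes the standard OpenNMT-tf serving signature with tokenized "tokens"/"length" inputs; exact input and output names can vary by version):

curl -X POST http://localhost:8501/v1/models/20190625:predict \
  -H "Content-Type: application/json" \
  -d '{"inputs": {"tokens": [["Hello", "world"]], "length": [2]}}'

If the model was exported with with_alignments: hard, the response outputs should include an alignment field in addition to the predicted tokens, lengths, and log probabilities.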