OpenNMT Forum

How to make Serving for Multi-Feature Transformer Model?

Dear Fellow Researchers,

I have built a multi-feature Transformer model and now have to test it in production.
I went through the documentation, but I could not find the procedure for serving a multi-feature model.

The documentation only covers single-input models.

I want to know how to use TensorFlow Serving for a multi-feature model, and what the structure and content of the config.json file should be.

I have 3 inputs (1 Source + 2 Features, Let’s Suppose) and One target output.

A little help or a lead would be really appreciated.

Thanks & Regards

Hi,

The Docker-based wrapper does not support additional word features.

Instead, you could use one of the serving options presented here. To find the names of the inputs, you can run the `saved_model_cli` script to inspect the exported model.

I am getting this as output:

```
The given SavedModel SignatureDef contains the following input(s):
  inputs['length_0'] tensor_info:
      dtype: DT_INT32
      shape: (-1)
      name: serving_default_length_0:0
  inputs['length_1'] tensor_info:
      dtype: DT_INT32
      shape: (-1)
      name: serving_default_length_1:0
  inputs['length_2'] tensor_info:
      dtype: DT_INT32
      shape: (-1)
      name: serving_default_length_2:0
  inputs['tokens_0'] tensor_info:
      dtype: DT_STRING
      shape: (-1, -1)
      name: serving_default_tokens_0:0
  inputs['tokens_1'] tensor_info:
      dtype: DT_STRING
      shape: (-1, -1)
      name: serving_default_tokens_1:0
  inputs['tokens_2'] tensor_info:
      dtype: DT_STRING
      shape: (-1, -1)
      name: serving_default_tokens_2:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['alignment'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1, -1, -1)
      name: StatefulPartitionedCall_8:0
  outputs['length'] tensor_info:
      dtype: DT_INT32
      shape: (-1, 1)
      name: StatefulPartitionedCall_8:1
  outputs['log_probs'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1)
      name: StatefulPartitionedCall_8:2
  outputs['tokens'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 1, -1)
      name: StatefulPartitionedCall_8:3
Method name is: tensorflow/serving/predict
```

My command was:

```
saved_model_cli show --dir data/export --tag_set serve --signature_def serving_default
```
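Given that signature, a request to the model must feed all six inputs: one token sequence per input stream plus its length. Below is a minimal sketch of building a TensorFlow Serving REST `predict` payload for this model; the input names (`tokens_0`, `length_0`, etc.) come from the `saved_model_cli` output above, while the feature values and the endpoint URL in the comment are hypothetical placeholders.

```python
def build_predict_request(tokens, feature_1, feature_2):
    """Build a TensorFlow Serving REST payload for a 3-input model
    (source tokens + 2 word-feature streams), using the input names
    from the exported SignatureDef shown above.

    Each argument is one sentence as a list of strings; all three
    streams must be token-aligned, so their lengths must match.
    """
    if not (len(tokens) == len(feature_1) == len(feature_2)):
        raise ValueError("feature streams must align with the source tokens")
    return {
        "inputs": {
            # Outer list = batch dimension (batch of one sentence here).
            "tokens_0": [tokens],
            "tokens_1": [feature_1],
            "tokens_2": [feature_2],
            "length_0": [len(tokens)],
            "length_1": [len(feature_1)],
            "length_2": [len(feature_2)],
        }
    }

payload = build_predict_request(
    ["Hello", "world"],
    ["N", "N"],   # hypothetical values for feature stream 1
    ["0", "1"],   # hypothetical values for feature stream 2
)
# The payload would then be POSTed to the model's REST endpoint, e.g.:
#   requests.post("http://host:8501/v1/models/<model_name>:predict",
#                 data=json.dumps(payload))
# where <model_name> depends on how tensorflow_model_server was started.
```

The response should contain the `tokens`, `length`, `log_probs`, and `alignment` outputs listed in the signature.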