I have built a multi-feature Transformer model and now need to test it in production.
I went through the documentation but could not find the procedure for serving a multi-feature model; it only covers single-input models.
I want to know how to use TensorFlow Serving for a multi-feature model, and what the content of the config.json file should be.
Suppose I have 3 inputs (1 source + 2 features) and one target output.
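For concreteness, I imagine the serving request would look something like the sketch below, using the TensorFlow Serving REST API. The input names source, feature_1, and feature_2 and the model name my_model are only placeholders, since I don't know what the exported signature actually calls them:

```python
import requests

# Placeholder input names; the real names depend on the exported signature.
payload = {
    "instances": [
        {
            "source": ["hello", "world"],
            "feature_1": ["f1_a", "f1_b"],
            "feature_2": ["f2_a", "f2_b"],
        }
    ]
}

# 8501 is the default TensorFlow Serving REST port; "my_model" is a placeholder.
response = requests.post(
    "http://localhost:8501/v1/models/my_model:predict",
    json=payload,
)
print(response.json())
```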
The Docker-based wrapper does not support additional word features.
Instead, you could use one of the serving options presented here. To find out the names of the inputs, you can run the saved_model_cli script to inspect the exported model.
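For example, a command like saved_model_cli show --dir export/1 --all prints the available signatures with their input and output names. A rough Python equivalent is sketched below, assuming a SavedModel exported under export/1 (the path is just an example):

```python
import tensorflow as tf

# Example export directory; point this at your actual exported model.
export_dir = "export/1"

model = tf.saved_model.load(export_dir)
signature = model.signatures["serving_default"]

# structured_input_signature is (positional_args, keyword_args);
# the keyword args map each input name to its TensorSpec.
for name, spec in signature.structured_input_signature[1].items():
    print("input:", name, spec.shape, spec.dtype)

# The output tensors can be listed the same way.
for name, tensor in signature.structured_outputs.items():
    print("output:", name, tensor.shape, tensor.dtype)
```

The input names printed here are the keys that the serving request payload has to use.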