Export SavedModel with TFLite mode


May I ask whether I can export a TransformerBig model to the SavedModel format with TFLite mode enabled?

The example code is like this:

```python
import opennmt
import tensorflow as tf

# source_vocab and target_vocab are defined elsewhere
model = opennmt.load_model("model_ckpt")
model.initialize({
    "source_vocabulary": source_vocab,
    "target_vocabulary": target_vocab,
})

with model.enable_tflite_mode():
    tf.saved_model.save(model, "./temp/", signatures=model.serve_function())
```

The goal is to limit the TensorFlow API calls to the TFLite-supported subset, because we don't have full TensorFlow API support on a specific hardware platform.
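For context, the usual way to enforce that restriction is at conversion time: `tf.lite.TFLiteConverter` can be told to allow only TFLite builtin ops, so the conversion fails loudly if the graph needs a full-TensorFlow op. Here is a minimal, self-contained sketch with a trivial stand-in model (the `Double` module and the `./temp/` path are placeholders, not part of the OpenNMT-tf code above):

```python
import tensorflow as tf

# Trivial stand-in model, just to have something to export and convert.
class Double(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return 2.0 * x

# Export to a SavedModel, as in the snippet above.
tf.saved_model.save(Double(), "./temp/")

converter = tf.lite.TFLiteConverter.from_saved_model("./temp/")
# Allow only TFLite builtin ops: conversion errors out if any op in the
# graph falls outside the TFLite-supported subset.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]
tflite_model = converter.convert()
print(len(tflite_model) > 0)
```

A model that converts under `TFLITE_BUILTINS` alone should only contain ops available on TFLite-only runtimes.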

Will it work in theory?

In practice, the code above ran, but when I load the SavedModel I get the following error:

```
ValueError: indices.shape[-1] must be <= params.rank, but saw indices shape: [1,1] and params shape: [] for '{{node transformer_big/while/word_embedder_1/GatherNd}} = ResourceGatherNd[Tindices=DT_INT32, _output_shapes=[[1,1024]], dtype=DT_FLOAT](transformer_big_while_word_embedder_1_gathernd_resource_0:0, transformer_big/while/word_embedder_1/ExpandDims:0)' with input shapes: [], [1,1].
```
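For what it's worth, the rank constraint in that error can be reproduced in isolation, independent of the OpenNMT-tf internals (a minimal sketch, not the model code):

```python
import tensorflow as tf

# tf.gather_nd requires indices.shape[-1] <= rank(params). Scalar params
# (shape [], rank 0) with indices of shape [1, 1] violates that, the same
# shapes reported in the error, which hints that the embedding table is
# being seen with an empty shape when the loaded function is traced.
try:
    tf.gather_nd(params=tf.constant(0.0), indices=tf.constant([[0]]))
except Exception as e:
    print("gather_nd rejected scalar params:", type(e).__name__)
```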

I am not quite sure whether this can be fixed.

Thank you in advance for your help.