I have managed to get OpenNMT-tf running on various Windows 10 setups without WSL and without a network connection. The big issue for consumer adoption is the "bloat" of the TensorFlow toolkit, which causes an unacceptable start-up delay. I have been thinking about doing inference via TensorFlow Lite instead. However, with various releases of TensorFlow 2.3 to 2.5 I have failed to convert a SavedModel with the script below. I get furthest with TF 2.3, which produces the traceback below. I would be keen to hear if anyone has succeeded in converting a SavedModel for TFLite inference:
Traceback (most recent call last):
  File "./tlite_converter.py", line 8, in <module>
    tflite_model = converter.convert()
  File "/home/miguel/testenv/lib/python3.5/site-packages/tensorflow/lite/python/lite.py", line 1076, in convert
    return super(TFLiteConverterV2, self).convert()
  File "/home/miguel/testenv/lib/python3.5/site-packages/tensorflow/lite/python/lite.py", line 878, in convert
    self._funcs[0], lower_control_flow=False))
  File "/home/miguel/testenv/lib/python3.5/site-packages/tensorflow/python/framework/convert_to_constants.py", line 1103, in convert_variables_to_constants_v2_as_graph
    aggressive_inlining=aggressive_inlining)
  File "/home/miguel/testenv/lib/python3.5/site-packages/tensorflow/python/framework/convert_to_constants.py", line 804, in __init__
    self._build_tensor_data()
  File "/home/miguel/testenv/lib/python3.5/site-packages/tensorflow/python/framework/convert_to_constants.py", line 823, in _build_tensor_data
    data = val_tensor.numpy()
  File "/home/miguel/testenv/lib/python3.5/site-packages/tensorflow/python/framework/ops.py", line 1063, in numpy
    maybe_arr = self._numpy()  # pylint: disable=protected-access
  File "/home/miguel/testenv/lib/python3.5/site-packages/tensorflow/python/framework/ops.py", line 1031, in _numpy
    six.raise_from(core._status_to_exception(e.code, e.message), None)  # pylint: disable=protected-access
  File "<string>", line 3, in raise_from
tensorflow.python.framework.errors_impl.InvalidArgumentError: Cannot convert a Tensor of dtype resource to a NumPy array.
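For what it's worth, the error comes from the converter's attempt to freeze the model's resource variables into constants, not from the export itself; the SavedModel loads fine in full TensorFlow. A minimal sanity check I run (sketch below, with a toy Keras SavedModel standing in for the real OpenNMT-tf export, and `toy_saved_model` being a path I made up) is to load the export back and list its serving signatures:

```python
import tensorflow as tf

# Toy SavedModel standing in for the real OpenNMT-tf export.
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(3,))])
model.save("toy_saved_model")

# If this load and signature lookup succeed, the export itself is sound and
# the failure is isolated to the converter's variable-freezing step.
loaded = tf.saved_model.load("toy_saved_model")
print(list(loaded.signatures.keys()))
```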
My script for use with TF 2.3 is:
import tensorflow as tf
import tensorflow_addons as tfa

tfa.register_all()

saved_model_dir = input("Enter path of saved model: ")

# Convert the model.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)  # path to the SavedModel directory
tflite_model = converter.convert()

# Save the model.
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
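One variant I have been experimenting with is to let ops without a TFLite builtin kernel fall back to full TensorFlow kernels via `tf.lite.OpsSet.SELECT_TF_OPS`, which is the documented route for models with control flow and variables. This is only a sketch and I have not confirmed it fixes the OpenNMT-tf model; the tiny LSTM model below is a stand-in for the real export, and `toy_saved_model` is a path I made up:

```python
import tensorflow as tf

# Stand-in for the OpenNMT-tf SavedModel: a tiny Keras model with an LSTM,
# which also exercises control-flow and variable ops during conversion.
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(4, input_shape=(5, 3)),
    tf.keras.layers.Dense(2),
])
model.save("toy_saved_model")

converter = tf.lite.TFLiteConverter.from_saved_model("toy_saved_model")
# Allow ops with no TFLite builtin kernel to fall back to TensorFlow kernels.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

Note that inference then requires linking the Select TF Ops delegate into the TFLite runtime, which adds back some of the binary size this exercise is trying to avoid.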