OpenNMT Forum

Errors when trying to load model exported by OpenNMT-tf

opennmt-tf

(Asylum) #1

Hello, I built a model with OpenNMT-tf and trained it with `exporters` set to “best”.
Since no “best” checkpoint was saved, I tried to load the exported .pb file directly with TensorFlow.

Then I ran the script below:

import tensorflow as tf


def load_graph(frozen_graph_filename):
    # Read the serialized protobuf and parse it as a GraphDef.
    with tf.gfile.GFile(frozen_graph_filename, "rb") as f:
        graph_def = tf.GraphDef()
        proto_b = f.read()
        graph_def.ParseFromString(proto_b)
    # Import the parsed GraphDef into a fresh graph.
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name="prefix")
    return graph


load_graph("./best/1551018311/saved_model.pb")

But when I ran this script, I got the following error:

Traceback (most recent call last):
  File "test.py", line 24, in <module>
    load_graph("./best/1551018311/saved_model.pb")
  File "test.py", line 15, in load_graph
    graph_def.ParseFromString(proto_b)
google.protobuf.message.DecodeError: Error parsing message

Is there anything else that needs to be done to use the exported model?


(Guillaume Klein) #2

Hi,

What do you want to achieve more precisely?

Here’s how the saved model should be loaded:

https://www.tensorflow.org/guide/saved_model#loading_a_savedmodel_in_python
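Note that `saved_model.pb` is a SavedModel protocol buffer, not a frozen `GraphDef`, which is why `GraphDef.ParseFromString` raises a `DecodeError`. A minimal sketch of loading the export with the TF 1.x SavedModel loader (the directory path is taken from the traceback above; pass the export directory, not the .pb file):

```python
import os
import tensorflow as tf  # TF 1.x API, matching the GraphDef code above

# The export directory that contains saved_model.pb and the variables/ folder.
export_dir = "./best/1551018311"

if os.path.isdir(export_dir):
    with tf.Session(graph=tf.Graph()) as sess:
        # load() restores both the graph and the variable values
        # saved under the "serve" tag.
        tf.saved_model.loader.load(
            sess, [tf.saved_model.tag_constants.SERVING], export_dir)
```
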


(Asylum) #3

I wanted to load the exported model (.pb) to run inference.
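For inference, the loaded SavedModel carries a signature that maps logical names to the input and output tensors to feed and fetch. A sketch of inspecting it (TF 1.x; the "serving_default" key is the standard default signature name, though the exact inputs and outputs depend on the exported model):

```python
import os
import tensorflow as tf  # TF 1.x API

export_dir = "./best/1551018311"

if os.path.isdir(export_dir):
    with tf.Session(graph=tf.Graph()) as sess:
        meta_graph = tf.saved_model.loader.load(
            sess, [tf.saved_model.tag_constants.SERVING], export_dir)
        # The signature def lists the tensor names to feed and fetch.
        sig = meta_graph.signature_def["serving_default"]
        input_names = {k: v.name for k, v in sig.inputs.items()}
        output_names = {k: v.name for k, v in sig.outputs.items()}
        print(input_names, output_names)
        # sess.run(fetches=output_names, feed_dict={...}) then runs inference.
```
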