OpenNMT Forum

Errors when trying to load model exported by OpenNMT-tf


(Asylum) #1

Hello, I built a model with OpenNMT-tf and trained it with `exporters` set to "best".
Since the "best" checkpoint was not saved as a regular checkpoint, I tried to load the exported .pb file directly with TensorFlow.

Then I ran the script below:

import tensorflow as tf

def load_graph(frozen_graph_filename):
    with tf.gfile.GFile(frozen_graph_filename, "rb") as f:
        graph_def = tf.GraphDef()
        # Parse the serialized proto. Note: saved_model.pb is a SavedModel
        # proto, not a frozen GraphDef, so this parse fails with DecodeError.
        graph_def.ParseFromString(f.read())
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name="prefix")
    return graph

load_graph("/path/to/saved_model.pb")

But when I ran this script, I got the following Error:

Traceback (most recent call last):
  File "", line 24, in
  File "", line 15, in load_graph
google.protobuf.message.DecodeError: Error parsing message

When using the exported model, is there anything else that should be done?

(Guillaume Klein) #2


What do you want to achieve more precisely?

Here’s how the saved model should be loaded:
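A minimal sketch of loading a SavedModel with the TF1-style loader API (via `tf.compat.v1` on recent TensorFlow); the trivial model exported here is only a stand-in for the OpenNMT-tf export, and the tensor names and temp path are assumptions for the demo:

```python
import tempfile
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

# Hypothetical export directory; in practice this is the OpenNMT-tf
# export directory that contains saved_model.pb and the variables/ folder.
export_dir = tempfile.mkdtemp() + "/model"

# Export a trivial SavedModel so the example is self-contained.
g = tf1.Graph()
with g.as_default():
    x = tf1.placeholder(tf.float32, shape=[None], name="x")
    y = tf1.identity(x * 2.0, name="y")
    with tf1.Session(graph=g) as sess:
        tf1.saved_model.simple_save(sess, export_dir,
                                    inputs={"x": x}, outputs={"y": y})

# Load it back with the SavedModel loader instead of parsing
# saved_model.pb as a GraphDef (which raises DecodeError).
with tf1.Session(graph=tf1.Graph()) as sess:
    tf1.saved_model.loader.load(
        sess, [tf1.saved_model.tag_constants.SERVING], export_dir)
    out = sess.run("y:0", feed_dict={"x:0": [1.0, 2.0]})
    print(out)
```

For a real OpenNMT-tf export you would look up the input and output tensor names from the model's serving signature rather than hard-coding them as above.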

(Asylum) #3

I wanted to load the exported model (.pb) to do the inference.