OpenNMT Inference System

Hi,
This is my first post, and I like OpenNMT-TF. Thanks for it! I have two questions about OpenNMT.

1 - I am writing an inference system and I want to pass the input as a string instead of a file.
My current input code:

input_fn = estimator_util.make_input_fn(
    self._model,
    tf.estimator.ModeKeys.PREDICT,
    self._config["infer"]["batch_size"],
    features_file=features_file,
    bucket_width=self._config["infer"]["bucket_width"],
    num_threads=self._config["infer"].get("num_threads"),
    prefetch_buffer_size=self._config["infer"].get("prefetch_buffer_size"),
    return_dataset=False)
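
Roughly, this hypothetical sketch is what I am after: building the prediction features from an in-memory sentence instead of a features_file on disk (the feature names "tokens" and "length" and the helper itself are my guesses, not working code against make_input_fn):

import tensorflow as tf

def make_input_fn_from_string(sentence):
    # Hypothetical: a one-element dataset built from a string in memory.
    def input_fn():
        tokens = sentence.split()  # placeholder for the real tokenizer
        return tf.data.Dataset.from_tensors({
            "tokens": tf.constant([tokens]),
            "length": tf.constant([len(tokens)]),
        })
    return input_fn

input_fn = make_input_fn_from_string("Hello world !")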

2 - I want to use the exported saved_model.pb instead of a checkpoint. Do you have sample code? I don’t want to use nmt-wizard.

Thanks for the answers.

Hello,

Here is an example, assuming export_dir points to a SavedModel directory:

import tensorflow as tf

with tf.Session() as sess:
    meta_graph_def = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    signature_def = meta_graph_def.signature_def["serving_default"]

    input_tokens = signature_def.inputs["tokens"].name
    input_length = signature_def.inputs["length"].name
    output_tokens = signature_def.outputs["tokens"].name
    output_length = signature_def.outputs["length"].name

    inputs = {
        input_tokens: [
            ["Hello", "world", "!", ""],
            ["How", "are", "you", "?"]],
        input_length: [3, 4]
    }

    batch_tokens, batch_length = sess.run(
        [output_tokens, output_length], feed_dict=inputs)

    for tokens, length in zip(batch_tokens, batch_length):
        tokens, length = tokens[0], length[0]  # Take the best hypothesis.
        length -= 1  # Ignore </s> token.
        print(tokens[:length])

Don’t forget that the model inputs and outputs are tokenized.
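
For instance, here is a minimal sketch of preparing that input, assuming whitespace splitting as a stand-in for the tokenizer that was actually used at training time:

# Turn raw strings into the padded "tokens" and "length" inputs used above.
# Whitespace splitting is only a placeholder: apply the same tokenization
# the model was trained with.
def make_batch(sentences):
    tokenized = [s.split() for s in sentences]
    lengths = [len(t) for t in tokenized]
    max_length = max(lengths)
    # Pad every sentence to the longest one with empty strings.
    padded = [t + [""] * (max_length - len(t)) for t in tokenized]
    return padded, lengths

tokens, lengths = make_batch(["Hello world !", "How are you ?"])
# tokens  -> [["Hello", "world", "!", ""], ["How", "are", "you", "?"]]
# lengths -> [3, 4]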

Thank you for your answer. I got this error when I ran the sample code.

meta_graph_def = tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "***/run/export/latest/1558388972")
  File "***\Python\Python36\lib\site-packages\tensorflow\python\util\deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 269, in load
    return loader.load(sess, tags, import_scope, **saver_kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 420, in load
    **saver_kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 350, in load_graph
    meta_graph_def, import_scope=import_scope, **saver_kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\training\saver.py", line 1457, in _import_meta_graph_with_return_elements
    **kwargs))
  File "***\Python\Python36\lib\site-packages\tensorflow\python\framework\meta_graph.py", line 806, in import_scoped_meta_graph_with_return_elements
    return_elements=return_elements)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\util\deprecation.py", line 507, in new_func
    return func(*args, **kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\framework\importer.py", line 399, in import_graph_def
    _RemoveDefaultAttrs(op_dict, producer_op_list, graph_def)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\framework\importer.py", line 159, in _RemoveDefaultAttrs
    op_def = op_dict[node.op]
KeyError: 'GatherTree'

What TensorFlow version are you using?

tb-nightly 1.14.0a20190301
tensorboard 1.13.1
tensorflow 1.13.1
tensorflow-datasets 1.0.2
tensorflow-estimator 1.13.0
tensorflow-estimator-2.0-preview 1.14.0.dev2019051700
tensorflow-gpu 1.13.1
tensorflow-metadata 0.13.0
tensorflow-serving-api 1.13.0
tf-estimator-nightly 1.14.0.dev2019030115
tf-nightly-gpu-2.0-preview 2.0.0.dev20190517

Can you clean up your installation? You have TensorFlow CPU, GPU, 1.13, 2.0, …

I tested the code above on 1.13 and it works.

tensorboard 1.13.1
tensorflow 1.13.1
tensorflow-estimator 1.13.0
tensorflow-gpu 1.13.1

I cleaned up my installation but I get the same error. I am sending you the full error output.

2019-05-21 15:45:37.174863: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
WARNING:tensorflow:From f:\Projects\1558393749\test.py:4: load (from tensorflow.python.saved_model.loader_impl) is deprecated and will be removed in a future version.
Instructions for updating:
This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.loader.load or tf.compat.v1.saved_model.load. There will be a new function for importing SavedModels in Tensorflow 2.0.
Traceback (most recent call last):
  File "***\pythonFiles\ptvsd_launcher.py", line 43, in <module>
    main(ptvsdArgs)
  File "***\pythonFiles\lib\python\ptvsd\__main__.py", line 410, in main
    run()
  File "***\pythonFiles\lib\python\ptvsd\__main__.py", line 291, in run_file
    runpy.run_path(target, run_name='__main__')
  File "***\Python\Python36\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "***\Python\Python36\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "***\Python\Python36\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "f:\Projects\1558393749\test.py", line 4, in <module>
    meta_graph_def = tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], export_dir="F:/Projects/1558393749")
  File "***\Python\Python36\lib\site-packages\tensorflow\python\util\deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 269, in load
    return loader.load(sess, tags, import_scope, **saver_kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 420, in load
    **saver_kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 350, in load_graph
    meta_graph_def, import_scope=import_scope, **saver_kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\training\saver.py", line 1457, in _import_meta_graph_with_return_elements
    **kwargs))
  File "***\Python\Python36\lib\site-packages\tensorflow\python\framework\meta_graph.py", line 806, in import_scoped_meta_graph_with_return_elements
    return_elements=return_elements)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\util\deprecation.py", line 507, in new_func
    return func(*args, **kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\framework\importer.py", line 399, in import_graph_def
    _RemoveDefaultAttrs(op_dict, producer_op_list, graph_def)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\framework\importer.py", line 159, in _RemoveDefaultAttrs
    op_def = op_dict[node.op]
KeyError: 'GatherTree'

My code:

import tensorflow as tf

with tf.Session() as sess:
    meta_graph_def = tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], export_dir = "F:/Projects/1558393749")
    signature_def = meta_graph_def.signature_def["serving_default"]

    input_tokens = signature_def.inputs["tokens"].name
    input_length = signature_def.inputs["length"].name
    output_tokens = signature_def.outputs["tokens"].name
    output_length = signature_def.outputs["length"].name

    inputs = {
        input_tokens: [
            ["Hello", "world", "!", ""],
            ["How", "are", "you", "?"]],
        input_length: [3, 4]
    }

    batch_tokens, batch_length = sess.run(
        [output_tokens, output_length], feed_dict=inputs)

    for tokens, length in zip(batch_tokens, batch_length):
        tokens, length = tokens[0], length[0]  # Take the best hypothesis.
        length -= 1  # Ignore </s> token.
        print(tokens[:length])

My path:

test.py
/1558393749/
├── assets
│   ├── src-vocab.txt
│   └── tgt-vocab.txt
├── saved_model.pb
└── variables
    ├── variables.data-00000-of-00001
    └── variables.index

I found that some users need to manually force the kernels to load. Can you try adding the following after the TensorFlow import?

from tensorflow.contrib.seq2seq.python.ops import beam_search_ops
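
In context, the top of the loading script would then look like this (a sketch reusing the example above; importing beam_search_ops registers the GatherTree op that the exported beam search graph references):

import tensorflow as tf

# Importing beam_search_ops forces the GatherTree kernel to be registered
# before the SavedModel graph is imported; without it, loading fails with
# KeyError: 'GatherTree' on TF 1.13.
from tensorflow.contrib.seq2seq.python.ops import beam_search_ops

with tf.Session() as sess:
    meta_graph_def = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)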

That worked, thanks for your help! I will ask more questions :smiley: Now you can rest.

I ran into the same problem. How did you solve it?