tensorboard 1.13.1
tensorflow 1.13.1
tensorflow-estimator 1.13.0
tensorflow-gpu 1.13.1
I cleaned up my installation, but I still get the same error. Here is the full error output:
2019-05-21 15:45:37.174863: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
WARNING:tensorflow:From f:\Projects\1558393749\test.py:4: load (from tensorflow.python.saved_model.loader_impl) is deprecated and will be removed in a future version.
Instructions for updating:
This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.loader.load or tf.compat.v1.saved_model.load. There will be a new function for importing SavedModels in Tensorflow 2.0.
Traceback (most recent call last):
  File "***\pythonFiles\ptvsd_launcher.py", line 43, in <module>
    main(ptvsdArgs)
  File "***\pythonFiles\lib\python\ptvsd\__main__.py", line 410, in main
    run()
  File "***\pythonFiles\lib\python\ptvsd\__main__.py", line 291, in run_file
    runpy.run_path(target, run_name='__main__')
  File "***\Python\Python36\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "***\Python\Python36\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "***\Python\Python36\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "f:\Projects\1558393749\test.py", line 4, in <module>
    meta_graph_def = tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], export_dir = "F:/Projects/1558393749")
  File "***\Python\Python36\lib\site-packages\tensorflow\python\util\deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 269, in load
    return loader.load(sess, tags, import_scope, **saver_kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 420, in load
    **saver_kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 350, in load_graph
    meta_graph_def, import_scope=import_scope, **saver_kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\training\saver.py", line 1457, in _import_meta_graph_with_return_elements
    **kwargs))
  File "***\Python\Python36\lib\site-packages\tensorflow\python\framework\meta_graph.py", line 806, in import_scoped_meta_graph_with_return_elements
    return_elements=return_elements)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\util\deprecation.py", line 507, in new_func
    return func(*args, **kwargs)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\framework\importer.py", line 399, in import_graph_def
    _RemoveDefaultAttrs(op_dict, producer_op_list, graph_def)
  File "***\Python\Python36\lib\site-packages\tensorflow\python\framework\importer.py", line 159, in _RemoveDefaultAttrs
    op_def = op_dict[node.op]
KeyError: 'GatherTree'
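If I read the last frames correctly, the import fails because 'GatherTree' is not in the dictionary of registered ops. My guess (not verified) is that GatherTree is the beam search op from tf.contrib.seq2seq, and that this module has to be imported so the op gets registered before the SavedModel is loaded. A minimal sketch of what I plan to try, purely under that assumption:

import tensorflow as tf
import tensorflow.contrib.seq2seq  # assumption: importing this registers the GatherTree op

with tf.Session() as sess:
    meta_graph_def = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING],
        export_dir="F:/Projects/1558393749")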
My Code:
import tensorflow as tf

with tf.Session() as sess:
    meta_graph_def = tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], export_dir = "F:/Projects/1558393749")
    signature_def = meta_graph_def.signature_def["serving_default"]
    input_tokens = signature_def.inputs["tokens"].name
    input_length = signature_def.inputs["length"].name
    output_tokens = signature_def.outputs["tokens"].name
    output_length = signature_def.outputs["length"].name

    inputs = {
        input_tokens: [
            ["Hello", "world", "!", ""],
            ["How", "are", "you", "?"]],
        input_length: [3, 4]
    }

    batch_tokens, batch_length = sess.run(
        [output_tokens, output_length], feed_dict=inputs)

    for tokens, length in zip(batch_tokens, batch_length):
        tokens, length = tokens[0], length[0]  # Take the best hypothesis.
        length -= 1  # Ignore </s> token.
        print(tokens[:length])
My Path:
test.py
/1558393749/
├── assets
│   ├── src-vocab.txt
│   └── tgt-vocab.txt
├── saved_model.pb
└── variables
    ├── variables.data-00000-of-00001
    └── variables.index
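To double-check what the exported graph actually references, I think saved_model.pb can be parsed directly without importing it into a session, so the missing op does not get in the way. A small sketch, assuming the standard TF 1.x SavedModel protobuf layout and the path shown above:

from tensorflow.core.protobuf import saved_model_pb2

# Read the SavedModel proto without importing the graph into a session,
# so the missing-op error does not get in the way.
saved_model = saved_model_pb2.SavedModel()
with open("F:/Projects/1558393749/saved_model.pb", "rb") as f:
    saved_model.ParseFromString(f.read())

# List every op type referenced by the graph; 'GatherTree' should show up here.
op_types = {node.op
            for meta_graph in saved_model.meta_graphs
            for node in meta_graph.graph_def.node}
print(sorted(op_types))

If 'GatherTree' appears in that list, the export itself seems fine and the problem would only be that the op is not registered in my Python process.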