OpenNMT Forum

Erroring on inference

Hello,
I got the following error during inference. Any idea why this happened?

Traceback (most recent call last):
  File "/usr/local/bin/onmt-main", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.6/dist-packages/opennmt/bin/main.py", line 337, in main
    log_time=args.log_prediction_time,
  File "/usr/local/lib/python3.6/dist-packages/opennmt/runner.py", line 419, in infer
    log_time=log_time,
  File "/usr/local/lib/python3.6/dist-packages/opennmt/inference.py", line 59, in predict_dataset
    predictions = infer_fn(features)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/def_function.py", line 828, in __call__
    result = self._call(*args, **kwds)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/def_function.py", line 855, in _call
    return self._stateless_fn(*args, **kwds)  # pylint: disable=not-callable
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/function.py", line 2943, in __call__
    filtered_flat_args, captured_inputs=graph_function.captured_inputs)  # pylint: disable=protected-access
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/function.py", line 1919, in _call_flat
    ctx, args, cancellation_manager=cancellation_manager))
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/function.py", line 560, in call
    ctx=ctx)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/execute.py", line 60, in quick_execute
    inputs, attrs, num_outputs)
tensorflow.python.framework.errors_impl.InvalidArgumentError: 2 root error(s) found.
  (0) Invalid argument:  In[0] mismatch In[1] shape: 1 vs. 0: [0,256,1] [0,0,64] 0 0
	 [[node transformer_base_relative_1/self_attention_encoder_1/self_attention_encoder_layer_6/transformer_layer_wrapper_30/multi_head_attention_18/MatMul_3 (defined at /lib/python3.6/dist-packages/opennmt/layers/transformer.py:112) ]]
  (1) Invalid argument:  In[0] mismatch In[1] shape: 1 vs. 0: [0,256,1] [0,0,64] 0 0
	 [[node transformer_base_relative_1/self_attention_encoder_1/self_attention_encoder_layer_6/transformer_layer_wrapper_30/multi_head_attention_18/MatMul_3 (defined at /lib/python3.6/dist-packages/opennmt/layers/transformer.py:112) ]]
	 [[transformer_base_relative_1/StatefulPartitionedCall/Minimum/_168]]
0 successful operations.
0 derived errors ignored. [Op:__inference_infer_16054]

Errors may have originated from an input operation.
Input Source operations connected to node transformer_base_relative_1/self_attention_encoder_1/self_attention_encoder_layer_6/transformer_layer_wrapper_30/multi_head_attention_18/MatMul_3:
 transformer_base_relative_1/self_attention_encoder_1/self_attention_encoder_layer_6/transformer_layer_wrapper_30/multi_head_attention_18/embedding_lookup_1/Identity (defined at /lib/python3.6/dist-packages/opennmt/layers/transformer.py:295)	
 transformer_base_relative_1/self_attention_encoder_1/self_attention_encoder_layer_6/transformer_layer_wrapper_30/multi_head_attention_18/Reshape_5 (defined at /lib/python3.6/dist-packages/opennmt/layers/transformer.py:111)

Input Source operations connected to node transformer_base_relative_1/self_attention_encoder_1/self_attention_encoder_layer_6/transformer_layer_wrapper_30/multi_head_attention_18/MatMul_3:
 transformer_base_relative_1/self_attention_encoder_1/self_attention_encoder_layer_6/transformer_layer_wrapper_30/multi_head_attention_18/embedding_lookup_1/Identity (defined at /lib/python3.6/dist-packages/opennmt/layers/transformer.py:295)	
 transformer_base_relative_1/self_attention_encoder_1/self_attention_encoder_layer_6/transformer_layer_wrapper_30/multi_head_attention_18/Reshape_5 (defined at /lib/python3.6/dist-packages/opennmt/layers/transformer.py:111)

Function call stack:
infer -> infer

Hi,

There is probably an empty line in your test file. Can you check that?
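In case it helps, here is a quick way to locate empty (or whitespace-only) lines in a plain-text test file. This is a generic sketch, not part of OpenNMT-tf itself; the function name is just for illustration:

```python
def find_empty_lines(path):
    """Return the 1-based line numbers of empty or whitespace-only lines."""
    empty = []
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            if not line.strip():
                empty.append(lineno)
    return empty
```

If it reports any line numbers, removing those lines (or replacing them with a placeholder token) should let inference run through.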

EDIT: we can make this part of the code more robust to empty inputs.

Thank you.