Loading a checkpoint to compute Transformer encoder outputs fails


I am trying to load a Transformer model checkpoint in order to get the outputs of the Transformer encoder. I tried the script OpenNMT-tf/examples/library/minimal_transformer_training.py and also adding a function to runner.py. Both fail with the error `Key encoder/LayerNorm/beta not found in checkpoint` from SelfAttentionEncoder.encode when loading the checkpoint.

Neither OpenNMT-tf-1.20.1 with tf-1.4 nor OpenNMT-tf-1.22.0 with tf-1.13.1 works.

Any help solving this would be appreciated.


You probably need to wrap your code in an additional variable scope:

with tf.variable_scope("transformer"):
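To make this concrete, here is a minimal sketch of what the scoped encoder call could look like. It assumes the checkpoint was produced by an OpenNMT-tf 1.x Transformer, whose variables are saved under a `transformer/` prefix (e.g. `transformer/encoder/LayerNorm/beta`), which is why building the encoder outside that scope produces the "Key encoder/LayerNorm/beta not found in checkpoint" error. The input shapes, `num_layers` value, and checkpoint path below are placeholders, not taken from the original post.

```python
import tensorflow as tf  # TensorFlow 1.x API
import opennmt as onmt   # OpenNMT-tf 1.x

# Placeholder inputs: batch of pre-embedded sequences of depth 512.
inputs = tf.placeholder(tf.float32, shape=[None, None, 512])
lengths = tf.placeholder(tf.int32, shape=[None])

encoder = onmt.encoders.SelfAttentionEncoder(num_layers=6)

# Build the encoder inside the "transformer" scope so that variable
# names match the keys stored in the checkpoint
# (transformer/encoder/... instead of encoder/...).
with tf.variable_scope("transformer"):
    outputs, state, out_lengths = encoder.encode(
        inputs,
        sequence_length=lengths,
        mode=tf.estimator.ModeKeys.PREDICT)

saver = tf.train.Saver()
with tf.Session() as sess:
    saver.restore(sess, "path/to/model.ckpt")  # placeholder path
    # sess.run(outputs, feed_dict=...) now yields the encoder outputs.
```

The key point is that `tf.train.Saver` matches variables to checkpoint entries by name, so the graph-side scope hierarchy must reproduce the one used at training time.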

Yeah, you are right! Problem solved. Thanks a lot!