Fine-tuning failed when loading checkpoint

Hi all,
I am trying to fine-tune my generic model.
What I have done already:

  1. generated the vocabulary jointly from the out-of-domain and in-domain sets
  2. trained the out-of-domain model with the generated vocabulary

Now I want to fine-tune my out-of-domain model on the in-domain data. I just changed the train_features_file and train_labels_file paths in my config, as in the sketch below.
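For reference, here is roughly what that change looks like (a minimal sketch assuming an OpenNMT-tf style YAML config; all file paths are placeholders):

```yaml
# Sketch of a fine-tuning config: only the training files change,
# the vocabularies and model_dir stay the same (paths are placeholders).
model_dir: run/out-of-domain          # keep pointing at the existing checkpoint

data:
  train_features_file: data/in-domain.train.src
  train_labels_file: data/in-domain.train.tgt
  source_vocabulary: data/joint.vocab.src
  target_vocabulary: data/joint.vocab.tgt
```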

I get the following message:
```text
WARNING:tensorflow:A checkpoint was restored (e.g. tf.train.Checkpoint.restore or tf.keras.Model.load_weights) but not all checkpointed values were used. See above for specific issues. Use expect_partial() on the load status object, e.g. tf.train.Checkpoint.restore(…).expect_partial(), to silence these warnings, or use assert_consumed() to make the check explicit. See https://www.tensorflow.org/guide/checkpoint#loading_mechanics for details.
```
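For context, the expect_partial() call named in the warning belongs to the TensorFlow checkpoint API. This is a generic sketch of what the warning refers to, not my actual code ("model_dir" is a placeholder directory):

```python
import tensorflow as tf

# Generic sketch of the API the warning mentions: restoring a checkpoint
# and silencing complaints about checkpointed values that were not used.
ckpt = tf.train.Checkpoint(layer=tf.keras.layers.Dense(4))
status = ckpt.restore(tf.train.latest_checkpoint("model_dir"))  # placeholder directory
status.expect_partial()  # don't warn about values left unused by this restore
```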

The vocabulary was generated jointly from the out-of-domain and in-domain data, so all tokens that appear in both sets are included in my vocab files.

Thank you in advance

Hi,

Is there an actual error message higher up in the logs?

Sorry, it was my fault: a file did not exist. I only saw that after checking the logs line by line. The message was pretty well hidden xD
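In case it helps anyone else, a quick way to catch this earlier is to check that every data path in the config actually exists before launching training. A minimal sketch ("config.yml" is a placeholder name for the config file):

```python
import os
import yaml  # pip install pyyaml

# Minimal sketch: flag missing files referenced in the data section
# of the config ("config.yml" is a placeholder name).
with open("config.yml") as f:
    config = yaml.safe_load(f)

for key, value in config.get("data", {}).items():
    if isinstance(value, str) and not os.path.isfile(value):
        print(f"Missing file: {key} -> {value}")
```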

Hi,
Did you manage to solve it?
I have the same problem when I try to fine-tune.