I am trying to fine-tune my generic model.
What I did already:
- generated the vocabulary from the out-of-domain and in-domain sets jointly
- trained the out-of-domain model with the generated vocabulary
Now I want to fine-tune my out-of-domain model. I just changed the train_features_file and train_labels_file paths in my config.
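For reference, the change looks roughly like this in an OpenNMT-tf style YAML config (the file paths here are hypothetical placeholders, not my real paths):

```yaml
data:
  # Swapped in the in-domain training data; vocabulary files are unchanged.
  train_features_file: data/in_domain.train.src
  train_labels_file: data/in_domain.train.tgt
  source_vocabulary: vocab/joint.src.vocab
  target_vocabulary: vocab/joint.tgt.vocab
```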
I get the following message:
WARNING:tensorflow:A checkpoint was restored (e.g. tf.train.Checkpoint.restore or tf.keras.Model.load_weights) but not all checkpointed values were used. See above for specific issues. Use expect_partial() on the load status object, e.g. tf.train.Checkpoint.restore(…).expect_partial(), to silence these warnings, or use assert_consumed() to make the check explicit. See https://www.tensorflow.org/guide/checkpoint#loading_mechanics for details.
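As far as I understand, this warning is generic TensorFlow behavior: it fires whenever a restored checkpoint contains values that no tracked object consumes. A minimal toy reproduction (not my actual model, just an assumed illustration of the mechanics) would be:

```python
import tensorflow as tf

# Save a checkpoint tracking two variables.
v1 = tf.Variable(1.0)
v2 = tf.Variable(2.0)
ckpt = tf.train.Checkpoint(a=v1, b=v2)
path = ckpt.save("/tmp/ckpt_demo")

# Restore into an object that only tracks `a`; the stored value of `b`
# goes unused, which is what triggers the "not all checkpointed values
# were used" warning unless it is silenced with expect_partial().
va = tf.Variable(0.0)
restored = tf.train.Checkpoint(a=va)
restored.restore(path).expect_partial()

print(va.numpy())  # the value of `a` is still restored correctly
```

So if I read it right, the warning alone does not necessarily mean the model weights failed to load, only that some checkpointed objects (e.g. optimizer slots) were not matched.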
The vocabulary was generated from the out-of-domain and in-domain sets jointly, so tokens from both sets are included in my vocab files.
Thank you in advance