Well, just the topic:
Is there a way to export the word embeddings trained in one model and use them as input for another model?
It’s technically possible, but not without work.
You can write a script that extracts the word embeddings from a checkpoint (the variable is named
w_embs) and serializes them in a text format such as the one used by GloVe. Then configure the
WordEmbedder to load pretrained embeddings from that file.
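A minimal sketch of the serialization step could look like the following. The GloVe text format is simply one line per word: the token followed by its space-separated vector values. The checkpoint path, vocabulary handling, and the exact variable name (w_embs, as mentioned above) are assumptions you would need to adapt to your own model:

```python
import numpy as np

def write_glove(vocab, embeddings, output_path):
    """Serialize an embedding matrix in GloVe text format:
    one line per word, the token followed by its vector values."""
    assert len(vocab) == embeddings.shape[0], "vocab and matrix rows must match"
    with open(output_path, "w") as out:
        for word, vector in zip(vocab, embeddings):
            out.write(word + " " + " ".join("%.6f" % v for v in vector) + "\n")

# Extracting the matrix from a TensorFlow checkpoint could look like this
# (paths and the "w_embs" variable name are assumptions):
#
#   reader = tf.train.load_checkpoint("model_dir/ckpt")
#   embeddings = reader.get_tensor("w_embs")
#
# then pair the matrix with the vocabulary file the model was trained with,
# in the same order, before calling write_glove.
```

The resulting text file can then be pointed to from the WordEmbedder's pretrained embeddings configuration.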