Is there a way to use the word embeddings trained in one model, export them, and use them as input for another model?


(Lockder) #1

Well, just what the topic says :slight_smile:

(Guillaume Klein) #2

It’s technically possible, but not without work.

You can build a script that extracts the word embeddings from a checkpoint (the variable is named w_embs) and serializes them in a text format like GloVe. Then configure the WordEmbedder to use pretrained embeddings.

(Lockder) #3

thank you!!