Using pretrained Elmo embeddings


(R) #1

I would like to use ELMo embeddings in my models, since they seem preferable to pretrained GloVe/word2vec embeddings.

However, I’m not sure how to add them to my OpenNMT pipeline (they’re in HDF5 format), since the conversion script only supports word2vec/GloVe. Is it possible at all? Should I perhaps change the ELMo format to the GloVe format and then do the conversion?

If this is currently impossible, consider this a feature request :slight_smile:

Thanks in advance for answering.


(R) #2

No one knows?


(Guillaume Klein) #3

Yes, you should try converting them to a simple format like GloVe, which all OpenNMT implementations can read.
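A minimal sketch of writing vectors in the plain-text GloVe format (one word per line, followed by its values), which OpenNMT's embedding scripts can read. The `vectors` dict below is a hypothetical stand-in for whatever you extract from the ELMo HDF5 file, e.g. after collapsing ELMo's layers into a single static vector per word by averaging:

```python
# Sketch: write word vectors in the plain-text GloVe format
# ("word v1 v2 ... vN", one word per line, space-separated).
# `vectors` is a placeholder for vectors pulled out of the ELMo
# HDF5 file (e.g. layers averaged into one static vector per word).
vectors = {
    "the": [0.1, 0.2, 0.3],
    "cat": [0.4, 0.5, 0.6],
}

with open("elmo_as_glove.txt", "w", encoding="utf-8") as f:
    for word, vec in vectors.items():
        # One line per word: the token, then its vector components.
        f.write(word + " " + " ".join("%.6f" % v for v in vec) + "\n")
```

Note that this only makes sense if you first reduce ELMo to one fixed vector per word, which loses its contextual nature.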


(Lockder) #4

It looks like ELMo is the new state of the art for word/sentence embeddings, but it's not a fixed vector like GloVe; I believe it has several layers.
Not sure how it could be implemented.


(Erik Chan) #5

Also interested in learning how to do this.