Using pretrained ELMo embeddings

I would like to use ELMo embeddings in my models, since they seem preferable to pretrained GloVe/word2vec embeddings.

However, I'm not sure how to add them to my OpenNMT pipeline (they're in HDF5 format), since the conversion script only supports word2vec/GloVe. Is it possible at all? Should I perhaps convert the ELMo output to the GloVe format first and then run the conversion?

If this is currently impossible, consider this a feature request :slight_smile:

Thanks in advance for answering.


No one knows?

Yes, you should try converting them to a simple format like GloVe. This is a format that all OpenNMT implementations can read.
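A minimal sketch of such a conversion, assuming you have already extracted one fixed vector per word from the HDF5 file (e.g. with `h5py`); `elmo_vectors` here is a hypothetical stand-in for those extracted vectors:

```python
# Write word vectors in the GloVe text format: one line per word,
# "word v1 v2 ... vN" with space-separated floats.
# `elmo_vectors` is a hypothetical stand-in for per-word vectors
# extracted from the ELMo HDF5 file (e.g. read with h5py).
elmo_vectors = {
    "hello": [0.1, -0.2, 0.3],
    "world": [0.4, 0.5, -0.6],
}

def write_glove_format(vectors, path):
    with open(path, "w", encoding="utf-8") as f:
        for word, vec in vectors.items():
            f.write(word + " " + " ".join("%.6f" % v for v in vec) + "\n")

write_glove_format(elmo_vectors, "elmo_as_glove.txt")
```

The resulting text file should then be accepted by the existing GloVe conversion path.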


It looks like ELMo is the new state of the art for word/sentence embeddings, but it's not a fixed vector like GloVe; I believe it has several layers, and the vectors are context-dependent.
Not sure how it could be implemented.
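One simple way to collapse the per-layer vectors into a single fixed vector is to average the layers. This is only a sketch of that idea: the actual ELMo formulation uses a learned, softmax-normalized weighted sum of the layers, and the toy dimensions below are made up:

```python
# ELMo produces one vector per layer (typically 3 layers).
# Averaging the layers is a crude but simple way to get one fixed
# vector; the real ELMo model learns softmax-normalized layer
# weights plus a scale factor instead.
def collapse_layers(layer_vectors):
    """Average a list of per-layer vectors into one fixed vector."""
    num_layers = len(layer_vectors)
    dim = len(layer_vectors[0])
    return [sum(layer[i] for layer in layer_vectors) / num_layers
            for i in range(dim)]

# Toy example: 3 layers of dimension 4 (real ELMo vectors are larger).
layers = [
    [1.0, 2.0, 3.0, 4.0],
    [3.0, 2.0, 1.0, 0.0],
    [2.0, 2.0, 2.0, 2.0],
]
fixed = collapse_layers(layers)  # -> [2.0, 2.0, 2.0, 2.0]
```

Note that this throws away ELMo's context sensitivity: a word gets one vector regardless of the sentence it appears in, which is what a GloVe-style format requires.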

Also interested in learning how to do this.