I would like to use ELMo embeddings in my models, since they seem preferable to pretrained GloVe/word2vec embeddings.
However, I'm not sure how to add them to my OpenNMT pipeline (they're in HDF5 format), since the conversion script only supports word2vec/GloVe. Is it possible at all? Should I perhaps convert the ELMo output to the GloVe format and then run the conversion?
If this is currently impossible, consider this a feature request.
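As a sketch of the "convert to GloVe format" idea: GloVe's plain-text format is just one token per line followed by its space-separated floats, so if you have already reduced each ELMo representation to a single fixed vector per word (a lossy step, since ELMo is contextual), writing that format is straightforward. The function name and the toy vectors below are hypothetical, not part of OpenNMT:

```python
import numpy as np

def write_glove_format(vectors, path):
    """Write a {word: 1-D array} mapping in GloVe's plain-text format:
    one line per token, the token followed by its space-separated floats."""
    with open(path, "w", encoding="utf-8") as f:
        for word, vec in vectors.items():
            f.write(word + " " + " ".join("%.6f" % x for x in vec) + "\n")

# Hypothetical fixed vectors, e.g. obtained by averaging ELMo's layers
# over a corpus (this discards the context-dependence that makes ELMo useful).
vectors = {
    "the": np.array([0.1, 0.2, 0.3]),
    "cat": np.array([0.4, 0.5, 0.6]),
}
write_glove_format(vectors, "elmo_as_glove.txt")
```

The resulting file should then be accepted by the existing word2vec/GloVe conversion script, though you lose everything that distinguishes ELMo from a static embedding table.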
Looks like ELMo is the new state of the art for word/sentence embeddings, but it's not a single fixed vector like GloVe; I believe it produces several layers of representations per token.
Not sure how it could be implemented.
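To make the "several layers" point concrete: per token, ELMo outputs one vector per layer, and the ELMo paper collapses them with a softmax-weighted sum scaled by a scalar. A minimal sketch of that reduction, assuming a per-token array of shape `(num_layers, dim)` (the function name and toy data are made up for illustration):

```python
import numpy as np

def collapse_layers(layer_reps, weights, gamma=1.0):
    """Collapse per-layer representations (shape: num_layers x dim) into one
    fixed vector via a softmax-normalized weighted sum, in the style of the
    ELMo paper's task-specific layer combination."""
    w = np.exp(weights - np.max(weights))
    w = w / w.sum()                              # softmax over layers
    return gamma * (w[:, None] * layer_reps).sum(axis=0)

# Hypothetical 3-layer, 4-dimensional token representation.
layers = np.arange(12, dtype=float).reshape(3, 4)
fixed = collapse_layers(layers, np.zeros(3))     # equal weights -> plain mean
```

With learned (unequal) weights this is what a downstream model would tune; with zero weights it degenerates to a plain average of the layers, which is the simplest way to get one fixed vector out of ELmo's stacked output.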