OpenNMT-tf MixedInputter (BERT embedding + character embedding)

Hi,
I found that the OpenNMT-tf MixedInputter allows combining word embeddings and character embeddings. Is there any way to use BERT word embeddings in this case?
Thank you very much.

Hi,

It is possible to use pretrained word embeddings: http://opennmt.net/OpenNMT-tf/embeddings.html. Is that what you are looking for?

Let me know if you have issues configuring a model with mixed inputs.
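For example, a model definition combining word and character embeddings could look roughly like this. This is a minimal sketch against the v1 Python API (`MixedInputter`, `WordEmbedder`, `CharConvEmbedder`); the layer sizes are illustrative, not tuned values:

```python
import opennmt as onmt

def model():
  return onmt.models.SequenceToSequence(
      source_inputter=onmt.inputters.MixedInputter(
          [
              # Word-level embeddings; these can be initialized from a
              # pretrained file (e.g. GloVe) via the data configuration,
              # see the embeddings documentation linked above.
              onmt.inputters.WordEmbedder(
                  vocabulary_file_key="source_words_vocabulary",
                  embedding_size=512),
              # Character-level embeddings produced by a convolution
              # over the character sequence of each word.
              onmt.inputters.CharConvEmbedder(
                  vocabulary_file_key="source_chars_vocabulary",
                  embedding_size=30,
                  num_outputs=100,
                  kernel_size=5),
          ],
          # Concatenate the word and character representations.
          reducer=onmt.layers.ConcatReducer()),
      target_inputter=onmt.inputters.WordEmbedder(
          vocabulary_file_key="target_words_vocabulary",
          embedding_size=512),
      encoder=onmt.encoders.BidirectionalRNNEncoder(
          num_layers=2,
          num_units=512),
      decoder=onmt.decoders.AttentionalRNNDecoder(
          num_layers=2,
          num_units=512))
```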

Thank you very much for your quick reply. I tried using GloVe word embeddings together with character embeddings, and it works. However, I found that BERT word embeddings are different: one word can have different embeddings depending on its context, so I cannot create a file (like glove-100000.txt) with one word and its embedding per line.
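In case it helps someone else, my data configuration looks roughly like this (the paths and the `source_words_embedding` key name are just my setup, following the embeddings doc linked above):

```yaml
data:
  source_words_embedding:
    path: data/glove/glove-100000.txt  # one word and its vector per line
    with_header: True
    case_insensitive: True
    trainable: False
```

The `WordEmbedder` in the model definition then references this entry with `embedding_file_key="source_words_embedding"`.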

I don't think it is possible to extract static word embeddings from BERT, since its representations depend on context. But you can do it with ELMo. If you want deep contextualized embeddings at the character level, ELMo is a good option too.