Fixed embeddings

Hello,

With OpenNMT-tf, I want to train my model using fixed subword embeddings (by subword I mean that I tokenize and then apply BPE to my corpus before building the embeddings with an external tool).

I found some explanations for using fixed embeddings, but not for the TensorFlow version. Is it possible?

Thanks for your time

Hi,

See this section and the trainable flag:

http://opennmt.net/OpenNMT-tf/data.html#pretrained-embeddings
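
For reference, pretrained embeddings are declared in the data section of the YAML configuration, and freezing them is controlled by the trainable flag. Here is a minimal sketch; the file paths are hypothetical and the exact key names should be checked against the linked documentation for your OpenNMT-tf version:

```yaml
data:
  source_embedding:
    path: data/src_bpe_embeddings.vec   # hypothetical path to your externally built subword embeddings
    with_header: True                   # set to True if the file starts with a "vocab_size dim" header line
    case_insensitive: True
    trainable: False                    # keep the source embeddings fixed during training
  target_embedding:
    path: data/tgt_bpe_embeddings.vec   # hypothetical; separate file for the target side
    with_header: True
    case_insensitive: True
    trainable: False                    # keep the target embeddings fixed as well
```

Since you applied BPE before building the embeddings, make sure the vocabulary used by the model matches the subword tokens in the embedding files.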


My bad, I skipped this section during my reading.

Thank you