Pre-trained embeddings in Transformer training

The OpenNMT-py documentation mentions using pre-trained embeddings (word2vec, GloVe). My question is: can we train a Transformer with pre-trained embeddings, given that the model already has its own embedding layer? And can FastText be used?

In theory it should work; see the FAQ: OpenNMT-py/FAQ.md at v3.0 · OpenNMT/OpenNMT-py · GitHub

You just point the config to your embeddings and it will load them; see the sketch below.
If you’re willing to test the v3.0 branch, I’ll help debug if it doesn’t work.
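
For reference, a minimal config sketch based on the options documented in that FAQ (`both_embeddings`, `src_embeddings`/`tgt_embeddings`, `embeddings_type`, `word_vec_size`); the file path is a placeholder, and the freeze options are an assumption about your OpenNMT-py version:

```yaml
# config.yaml -- add these to your usual data/model options

# Use the same pre-trained vectors on both encoder and decoder sides.
# To set them separately, use src_embeddings / tgt_embeddings instead.
both_embeddings: glove_dir/glove.6B.100d.txt

# Supported types: "GloVe" and "word2vec" (text format).
embeddings_type: "GloVe"

# Must match the dimension of the pre-trained vectors.
word_vec_size: 100

# Optionally keep the loaded vectors fixed during training
# (assumes these flags exist in your version; check `onmt_train --help`).
freeze_word_vecs_enc: true
freeze_word_vecs_dec: true
```

As for FastText: vectors exported in its textual `.vec` format follow the word2vec text format, so they should load with `embeddings_type: "word2vec"`.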

Cheers.

Should this also work for OpenNMT-tf?