Word_vec_size when using pre-trained embeddings

Hi,

I’m using pre-trained embeddings to train the model. I wonder how the word_vec_size option behaves when I also provide pre_word_vecs_enc and pre_word_vecs_dec.
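
For context, this is roughly how I sanity-check that my pre-trained vectors line up with the value I plan to pass as word_vec_size (a minimal sketch, assuming each .pt file holds a [vocab_size, emb_dim] tensor saved with torch.save; the paths and the 300 are just placeholders for my own setup):

```python
import torch

# Hypothetical paths -- substitute your own pre-trained embedding files.
enc_emb_path = "data/embeddings.enc.pt"
dec_emb_path = "data/embeddings.dec.pt"
word_vec_size = 300  # the value I intend to pass as word_vec_size

for name, path in [("encoder", enc_emb_path), ("decoder", dec_emb_path)]:
    emb = torch.load(path)  # expected shape: [vocab_size, emb_dim]
    vocab_size, emb_dim = emb.shape
    print(f"{name}: vocab={vocab_size}, dim={emb_dim}")
    assert emb_dim == word_vec_size, (
        f"word_vec_size ({word_vec_size}) does not match the "
        f"{name} embedding dimension ({emb_dim})"
    )
```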

Is word_vec_size only used to match the dimension of the pre-trained embeddings, or does it affect anything else in the model?

Thanks!