Weights initialization seems not random

Hello.
I trained a model with some parameters. Then I wanted to train the model with the same parameters again, but from different initial weights, just to see if it would perhaps converge to a different point.
I assumed I needed to do nothing more than start train.lua again, since the documentation states that parameters are initialized randomly from a uniform distribution (http://opennmt.net/OpenNMT/options/train/ parameter -param_init).
However, as the training of the second model proceeded, I noticed that its perplexity and BLEU scores were exactly the same as in the first model. After 30 epochs this is still the case; every score is identical. This led me to believe that the weights used for initialization were the same rather than random, which is counter-intuitive and isn't reflected in the documentation.
So, I would be grateful if someone could explain:

  1. Is my understanding of the situation correct?
  2. If so, what should I do to initialize weights with true random values?

Hello,

The weights are indeed random, but the random seed is fixed by default. See the section “Other options” in the train.lua command line options:

http://opennmt.net/OpenNMT/options/train/#other-options

To introduce variation between trainings, simply set a different random seed.
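For example, assuming the option is named `-seed` as listed under “Other options” (the data and model file names below are just placeholders), a second run with a different seed could look like this:

```
# First run: uses the default (fixed) seed, so initialization is reproducible.
th train.lua -data demo-train.t7 -save_model model_a

# Second run: pass a different -seed so the uniform initialization
# (controlled by -param_init) starts from different random weights.
th train.lua -data demo-train.t7 -save_model model_b -seed 12345
```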


Thanks, that’s exactly what I hoped for.