LSTM-based seq2seq generates 0, the padding token

It's really weird: I'm using the simplest LSTM model, but the intermediate results are almost all 0, the padding token. I'm confused.
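
To make the setup concrete, here is a minimal sketch of what I mean by the simplest LSTM seq2seq (PyTorch assumed; the vocabulary size, dimensions, and the random data are placeholders, not my exact code):

```python
import torch
import torch.nn as nn

PAD_IDX = 0  # index 0 is the padding token, as described above
VOCAB_SIZE, EMB_DIM, HID_DIM = 1000, 64, 128  # placeholder sizes

class Seq2Seq(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMB_DIM, padding_idx=PAD_IDX)
        self.encoder = nn.LSTM(EMB_DIM, HID_DIM, batch_first=True)
        self.decoder = nn.LSTM(EMB_DIM, HID_DIM, batch_first=True)
        self.out = nn.Linear(HID_DIM, VOCAB_SIZE)

    def forward(self, src, tgt):
        # Encode the source, then decode conditioned on the final encoder state.
        _, state = self.encoder(self.embed(src))
        dec_out, _ = self.decoder(self.embed(tgt), state)
        return self.out(dec_out)  # logits over the vocabulary

model = Seq2Seq()
src = torch.randint(1, VOCAB_SIZE, (2, 10))  # dummy source batch (no padding)
tgt = torch.randint(1, VOCAB_SIZE, (2, 10))  # dummy target batch
logits = model(src, tgt)
# The greedy predictions below are the "intermediate results" I checked;
# during my training they come out as almost all 0 (the padding token).
print(logits.argmax(-1))
```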