Has anyone experimented with the dense bridge?

I’m wondering if anyone has experimented with the dense bridge between the encoder and decoder. I haven’t seen any studies that evaluate different encoder/decoder bridges with language models, so I’d like to know if anyone has learned anything about this.

Hi @devinbostIL, I am not sure why you connect the bridges (http://opennmt.net/OpenNMT/training/models/#bridges) between the encoder and decoder with language models. On our side, the experiments with the dense bridge have been disappointing compared to the default copy mode, which is odd since dense bridges are a superset of copy bridges.
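
For anyone curious about the difference, here is a rough PyTorch sketch (not the actual OpenNMT code; the class names, shapes, and the absence of a nonlinearity are just illustrative):

```python
import torch
import torch.nn as nn

class CopyBridge(nn.Module):
    # Default "copy" mode: the encoder's final hidden state is passed
    # through unchanged as the decoder's initial state.
    def forward(self, enc_final):
        # enc_final: (num_layers, batch, hidden_size)
        return enc_final

class DenseBridge(nn.Module):
    # "Dense" bridge: a learned linear map is applied to the encoder's
    # final hidden state before it initializes the decoder.
    # (A real implementation may also add a nonlinearity.)
    def __init__(self, hidden_size):
        super().__init__()
        self.linear = nn.Linear(hidden_size, hidden_size)

    def forward(self, enc_final):
        # enc_final: (num_layers, batch, hidden_size)
        return self.linear(enc_final)

# A plain linear bridge can learn the identity map (weight = I, bias = 0),
# which is why the dense bridge is a superset of the copy bridge.
enc_final = torch.randn(2, 8, 512)   # toy (layers, batch, hidden) tensor
dec_init_copy = CopyBridge()(enc_final)
dec_init_dense = DenseBridge(512)(enc_final)
```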

The intention is to initialize the decoder state from the last encoder layer. See Question Generation… from Prakar Agrawal. I’m using OpenNMT-py, and the option is -bridge.
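
For example, a training command could look like the one below (only -bridge is the option in question; the data and model paths and the other flags are placeholders from a typical run, so check the options in your OpenNMT-py version):

```bash
# Hypothetical paths; -bridge adds the extra layer between the last
# encoder state and the first decoder state.
python train.py -data data/demo -save_model demo-model \
    -encoder_type rnn -decoder_type rnn -bridge
```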