Batch Normalization / L2 regularization

Hello team,

Just wondering if you ever tested these techniques as an alternative to dropout.
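For context on what "these techniques" could look like in practice, here is a minimal sketch (my own illustration, not code from this project) contrasting a batch-normalized model with L2 regularization against a dropout baseline, written in PyTorch; the layer sizes and hyperparameters are arbitrary.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: regularize with BatchNorm + L2 (weight decay)
# instead of dropout. All sizes and rates are illustrative.
bn_model = nn.Sequential(
    nn.Linear(32, 64),
    nn.BatchNorm1d(64),   # normalizes activations; has a mild regularizing effect
    nn.ReLU(),
    nn.Linear(64, 10),
)
# L2 regularization applied through the optimizer's weight_decay term
optimizer = torch.optim.SGD(bn_model.parameters(), lr=0.01, weight_decay=1e-4)

# Dropout baseline for comparison
dropout_model = nn.Sequential(
    nn.Linear(32, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),    # randomly zeroes activations during training
    nn.Linear(64, 10),
)

x = torch.randn(8, 32)   # batch of 8 samples, 32 features
bn_model.train()         # BatchNorm uses batch statistics in train mode
y = bn_model(x)
```

Note that naively inserting BatchNorm into a recurrent cell does not work the same way as in a feed-forward net, which may be why the PR mentioned below ran into trouble.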



There was a pull request implementing recurrent batch normalization, but it did not work at all.

Are you referring to a paper or another framework?

I see this:

So nobody actually fixed it, right?