Can I change the batch_size when training in the next epoch?

Hi, I use the setting single_pass: true to train one epoch at a time.
My vocab sizes are approximately:
source: 64000
target: 84000

In the beginning, I set batch_size to about 2048 to fit the GPU memory (2x RTX 2070, 8 GB each) and avoid the OOM errors that occur whenever batch_size is larger than 2048.
For the next epoch, however, I can raise batch_size to 2500 and train without OOM.
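
Roughly, my config looks like the sketch below (an OpenNMT-py style YAML; the paths and the data section are simplified placeholders and may differ by version):

```yaml
# Rough sketch of the setup described above (OpenNMT-py style YAML);
# paths and the data section are placeholders.
data:
    corpus_1:
        path_src: data/train.src
        path_tgt: data/train.tgt

src_vocab: data/vocab.src       # ~64000 entries
tgt_vocab: data/vocab.tgt       # ~84000 entries

save_model: run/model
single_pass: true               # one pass over the training data per run
batch_type: tokens              # assumption: token-based batching
batch_size: 2048                # largest value that avoids OOM on the first pass

world_size: 2                   # 2x RTX 2070, 8 GB each
gpu_ranks: [0, 1]
```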

So, can I change batch_size for the next epoch while training this way? And why can't batch_size be set any higher at the beginning?

Hi,

Sure.
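
For instance, assuming an OpenNMT-py style setup, one way is to launch the next pass with train_from pointing at the checkpoint saved at the end of the previous pass, together with a raised batch_size (the checkpoint name and step number below are placeholders):

```yaml
# Hypothetical sketch: resume the next single pass from the last checkpoint
# with a larger batch_size; keep the data/vocab options from the first run.
train_from: run/model_step_50000.pt   # placeholder: checkpoint from the end of the first pass
single_pass: true
batch_type: tokens
batch_size: 2500                      # raised from 2048 now that it fits in memory
```

Only the training options are shown here; the data and vocab settings stay the same as in the first run.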

This sounds odd. Do you reproduce it consistently, or was it a one-time observation?

Yes, I have tried it many times and it reproduces consistently.
batch_size can be increased by about 10%~20% after the first epoch.