How to set the number of training steps when fine-tuning NLLB-200

Hello, I have some doubts about the number of training steps for fine-tuning NLLB-200. I have a dataset where both the source-language and target-language files are 136,409 lines, and the batch size is 384. So the total steps should be:
136409 * 3 / 384 ≈ 1066, where 3 is the number of epochs. Am I right?
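As a quick sanity check on that arithmetic, here is a small sketch of the calculation. Note that if the trainer drops or pads the last partial batch, the per-epoch step count is rounded (up, in this sketch), which gives a slightly different total than dividing the raw product:

```python
import math

dataset_size = 136409   # parallel sentence pairs
batch_size = 384        # sentences per optimizer update
epochs = 3

# Steps per epoch, rounding up so the final partial batch counts as a step
steps_per_epoch = math.ceil(dataset_size / batch_size)
total_steps = steps_per_epoch * epochs

print(steps_per_epoch, total_steps)  # 356 per epoch, 1068 total
```

So roughly 1066-1068 steps for 3 epochs, depending on how the last batch of each epoch is handled. If gradient accumulation is used, the effective batch size (per-device batch × accumulation steps × number of devices) should replace `batch_size` here.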

On my last run, I set the training steps to 4000. At step 1000 the training accuracy was 43.87, and the accuracy keeps increasing as the steps go up. Is it OK to set it to 4000 steps, or will the model overfit? And what level of accuracy counts as good?