Modifying the gradients during training

I am trying to “emphasize” some specific words in the output.

By “emphasizing” I mean the following: I have a set of important words, and whenever one of these words is translated incorrectly during training, I want the gradient update for it to have a larger magnitude. I first tried modifying the loss, but the loss is just a single scalar tensor, and rescaling it did not seem to help.
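For instance, as far as I understand, scaling the whole scalar loss only rescales every gradient by the same factor, so it cannot single out specific words:

```python
import torch

# toy illustration: doubling the scalar loss doubles ALL gradients uniformly
param = torch.nn.Parameter(torch.tensor([1.0, 2.0, 3.0]))
loss = (param ** 2).sum()   # unscaled gradients would be [2., 4., 6.]
(2 * loss).backward()
print(param.grad)           # [4., 8., 12.], exactly 2x across the board
```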

Then I wanted to modify the gradients directly. The gradients are computed and applied here: https://github.com/OpenNMT/OpenNMT-py/blob/master/onmt/trainer.py#L294-L318

self.optim.step() updates the parameters, but I don’t know how to modify the gradients before that step happens.

For example, suppose some parameters are currently [1, 2, 3] and their gradients are [0.5, 0.5, 0.5]. I want to change the gradients to [0.5, 0.25, 0.5] and then apply the update. That is the kind of thing I am trying to do.
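Something like this toy sketch in plain PyTorch (made-up values, not the actual OpenNMT-py code):

```python
import torch

param = torch.nn.Parameter(torch.tensor([1.0, 2.0, 3.0]))
optim = torch.optim.SGD([param], lr=1.0)

loss = (0.5 * param).sum()  # contrived so that grad = [0.5, 0.5, 0.5]
loss.backward()

# rescale individual gradient entries in place before the update
param.grad *= torch.tensor([1.0, 0.5, 1.0])  # grad is now [0.5, 0.25, 0.5]

optim.step()  # parameters are updated with the modified gradients
```

I suppose a gradient hook (param.register_hook) could apply the same scaling automatically during backward(), but I am not sure where in the trainer to plug this in.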

Any hints? Thanks a lot!

Did you try disabling the loss reduction?
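With reduction disabled, the criterion returns one loss value per target token instead of a single scalar, so you can upweight the positions that contain your important words before summing. A rough sketch of the idea (the `is_important` mask is hypothetical; you would build it from your own word list):

```python
import torch
import torch.nn.functional as F

# toy shapes: 4 target tokens over a 10-word vocabulary
logits = torch.randn(4, 10, requires_grad=True)
target = torch.tensor([1, 3, 5, 7])

# reduction='none' keeps one loss value per target token
per_token_loss = F.cross_entropy(logits, target, reduction='none')

# hypothetical mask marking positions that contain important words
is_important = torch.tensor([False, True, False, True])
weights = torch.ones_like(per_token_loss)
weights[is_important] = 2.0  # emphasized tokens contribute larger gradients

loss = (per_token_loss * weights).sum() / weights.sum()
loss.backward()
```

You would then feed this weighted sum into the training step wherever the scalar loss is currently produced, and the gradients for the emphasized tokens come out larger without touching the optimizer at all.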

Thanks a lot! I am busy preparing for interviews. I will try this later.