How to visualize attention?

pytorch

(Jerin Philip) #1

Hello,

How do I go about visualizing attention using OpenNMT-py?


(Jerin Philip) #2

Figured out a way around it myself.

I used the Translation class and logged everything required (src, tgt, pred, attentions) into a file, batch by batch, which worked well enough for me. Once you have a (src, pred, attention matrix) triple, visualizing it is pretty straightforward.


(Arvid) #3

Thanks for sharing, this helped me a lot!