Visualize attention

Hi,

Is there a simple way to visualize attention at inference time? (OpenNMT-tf)

Thanks in advance

Hi,

Here is the function that prints the inference output:

You could change it to also dump the attention values.
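For example, here is a minimal sketch of what dumping the attention next to the printed tokens could look like. It assumes each prediction is a dict carrying an `attention` matrix of shape `[target_length, source_length]`; the key name, the shape, and the helper name are assumptions for illustration, not the actual OpenNMT-tf interface:

```python
import numpy as np

def print_prediction_with_attention(prediction, stream=None):
    """Print the predicted tokens, then dump one attention row per target token."""
    length = prediction["length"]
    tokens = prediction["tokens"][:length]
    print(" ".join(tokens), file=stream)
    # Hypothetical "attention" key: rows index target tokens, columns source tokens.
    attention = np.asarray(prediction["attention"])[:length]
    for token, row in zip(tokens, attention):
        # One line per target token: the weight assigned to each source token.
        print("%s\t%s" % (token, " ".join("%.4f" % w for w in row)), file=stream)

if __name__ == "__main__":
    # Dummy prediction just to show the output format.
    dummy = {
        "tokens": ["Hallo", "Welt", "</s>"],
        "length": 2,
        "attention": [[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]],
    }
    print_prediction_with_attention(dummy)
```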

Hi,

How about the PyTorch version? Does it have a way to see the attention values?