Tensorflow: Getting Attention Activations to Visualize Attention in Seq2Seq

Created on 27 Mar 2016 · 3 comments · Source: tensorflow/tensorflow

All attention papers feature some visualization of the attention weights on some input. Has anyone been able to run a sample through the Seq2Seq Attention Decoder model in translate.py and get the attention activations to do such a visualization?

All 3 comments

The attention mask is available as a tensor here:
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/seq2seq.py#L522

It should be easy to fetch it during a `session.run` call and visualize it (see the sketch below). You could also post this on StackOverflow to see whether someone in the wider community has already done this visualization. I am closing this issue, since TensorFlow provides the required functionality.
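For readers looking for the concrete fetch-and-plot step, here is a minimal sketch of what the comment describes. It assumes you have already exposed the per-step attention softmax tensors as a list (for example by returning them from `attention_decoder` in the legacy seq2seq.py, or by looking the ops up by name in the graph) and that `feed` is the same feed_dict used for an ordinary decoding step; `attn_tensors`, `feed`, and `plot_attention` are placeholder names for illustration, not part of the translate.py API:

```python
import matplotlib.pyplot as plt
import numpy as np
import tensorflow as tf


def plot_attention(sess, attn_tensors, feed, example=0):
    """Fetch per-step attention weights and draw the usual heatmap.

    attn_tensors: list of tensors, one per decoder step, each of shape
        [batch_size, input_length]. How you expose them is up to you,
        e.g. by also returning the attention softmax from
        attention_decoder (placeholder assumption).
    feed: the same feed_dict used for a normal decoding step; fetching
        the attention tensors alongside the outputs costs nothing extra.
    example: which sentence in the batch to visualize.
    """
    attn_per_step = sess.run(attn_tensors, feed_dict=feed)

    # Stack the per-step rows into an [output_length, input_length]
    # matrix for the chosen example, then plot it as a heatmap, as in
    # the attention papers.
    attn_matrix = np.stack([a[example] for a in attn_per_step])
    plt.imshow(attn_matrix, cmap="viridis", aspect="auto")
    plt.xlabel("input position")
    plt.ylabel("decoder step")
    plt.colorbar(label="attention weight")
    plt.show()
```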

The link is broken. What is the correct link?

Same problem here.
