Explore how queries attend to documents: select a query position and a document window, and watch attention weights redistribute in real time.
Interactive Attention makes attention tangible: move the query position and observe how attention weights redistribute across all documents. A tool for experimenting with context-length and position effects.
A hands-on complement to the static heatmaps, enabling experiential learning about attention mechanisms.
Debugging LLM outputs often requires attention analysis. This interactive tool builds the intuition needed to understand what the model is "looking at".
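The redistribution the tool visualizes comes from the softmax in scaled dot-product attention: because the weights must sum to 1, moving the query toward one document necessarily pulls weight away from the others. A minimal sketch of that computation (this is an illustration with toy 2-dimensional vectors, not the tool's actual implementation; `attention_weights` is a hypothetical helper name):

```python
import math

def attention_weights(query, keys):
    """Scaled dot-product attention weights for one query over document keys.

    query: list of floats, length d
    keys:  list of such lists, one per document
    Returns softmax-normalized weights (one per document, summing to 1).
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    m = max(scores)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy document keys: moving the query toward one key shifts weight to that document.
keys = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]
w_a = attention_weights([1.0, 0.0], keys)  # query aligned with document 0
w_b = attention_weights([0.0, 1.0], keys)  # query aligned with document 1
```

Here `w_a` puts the most weight on document 0 and `w_b` on document 1, while both weight vectors still sum to 1; this zero-sum redistribution is exactly what the slider in the tool makes visible.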