ronniross/attention-heatmap-visualizer
A set of scripts to generate full attention-head heatmaps for transformer-based LLMs
This tool helps AI researchers understand how large language models (LLMs) process text. Given an input prompt, it generates detailed heatmaps showing which parts of the text the model 'pays attention' to across all of its layers and heads, making it possible to visualize the model's focus and identify patterns that inform debugging, bias detection, and architectural improvements.
Use this if you are an AI researcher wanting to visualize and analyze the internal workings of transformer-based language models to understand their attention mechanisms.
Not ideal if you are looking for a plug-and-play solution for production LLM deployments or do not need deep insights into model attention patterns.
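To illustrate the quantity these heatmaps visualize, here is a minimal, self-contained sketch (not this repo's code): it computes toy per-head attention weights via scaled dot-product attention over random queries and keys, then renders one head as a heatmap. A real run would instead pull the `attentions` tensors out of a transformer's forward pass; the token list, head count, and dimensions below are illustrative assumptions.

```python
# Illustrative sketch only: toy scaled dot-product attention weights,
# rendered as a single-head heatmap the way this tool does per layer/head.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
tokens = ["The", "model", "attends", "to", "these", "tokens"]
seq_len, d_head, n_heads = len(tokens), 16, 4

# Random per-head queries and keys stand in for a real model's projections.
Q = rng.standard_normal((n_heads, seq_len, d_head))
K = rng.standard_normal((n_heads, seq_len, d_head))

# Attention weights: softmax over scaled dot products, row-wise per query.
scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)   # (heads, seq, seq)
weights = np.exp(scores - scores.max(-1, keepdims=True))
weights /= weights.sum(-1, keepdims=True)             # rows sum to 1

fig, ax = plt.subplots()
ax.imshow(weights[0], cmap="viridis")
ax.set_xticks(range(seq_len), tokens, rotation=45)
ax.set_yticks(range(seq_len), tokens)
ax.set_title("Head 0 attention weights")
fig.savefig("attention_head0.png", bbox_inches="tight")
```

With a Hugging Face model, the same plotting code applies to each `(layer, head)` slice of the tuple returned when the model is called with `output_attentions=True`.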
Stars
13
Forks
2
Language
Jupyter Notebook
License
MIT
Last pushed
Feb 26, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ronniross/attention-heatmap-visualizer"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
AI-Hypercomputer/maxtext
A simple, performant and scalable Jax LLM!
rasbt/reasoning-from-scratch
Implement a reasoning LLM in PyTorch from scratch, step by step
mindspore-lab/mindnlp
MindSpore + 🤗Huggingface: Run any Transformers/Diffusers model on MindSpore with seamless...
mosaicml/llm-foundry
LLM training code for Databricks foundation models
rickiepark/llm-from-scratch
<밑바닥부터 만들면서 공부하는 LLM>(길벗, 2025)의 코드 저장소