sunnynguyen-ai/llm-attention-visualizer
Interactive tool for analyzing attention patterns in transformer models with layer-wise visualizations, token importance scoring, and attention flow diagrams
This interactive tool helps you understand how large language models (LLMs) interpret text. You input text, and visualizations show how the model's attention mechanism weights different tokens and their relationships. It is aimed at researchers, educators, and anyone who wants to demystify how these models process text, with no coding required.
No commits in the last 6 months.
Use this if you need to visually analyze and interpret the internal workings of transformer-based language models on specific text inputs.
Not ideal if you are looking for a tool to train models or perform model inference rather than understand their internal attention patterns.
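The attention patterns such a tool visualizes come from the transformer's self-attention layers. As a minimal sketch (pure NumPy, not this repo's code), scaled dot-product attention produces, for each token, a probability distribution over all tokens; that per-token distribution is exactly what attention visualizers render as heatmaps and flow diagrams:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Return the attended output and the attention-weight matrix
    that visualizers render as a token-by-token heatmap."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)              # (tokens, tokens)
    # Numerically stable softmax over the last axis: rows sum to 1.
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
n_tokens, d = 5, 8
Q, K, V = (rng.standard_normal((n_tokens, d)) for _ in range(3))
out, attn = scaled_dot_product_attention(Q, K, V)
print(attn.shape)  # each row is one token's attention distribution
```

In a real model these weights are taken per layer and per head (e.g. via `output_attentions=True` in Hugging Face Transformers), which is what enables layer-wise views.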
Stars
14
Forks
13
Language
Python
License
MIT
Last pushed
Sep 24, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/sunnynguyen-ai/llm-attention-visualizer"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
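The same endpoint can be called from a script. A minimal Python sketch: the URL pattern is taken from the curl example above, but the JSON field names in the commented fetch are assumptions, since the response schema is not documented here.

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the API URL for a repository's quality data."""
    return f"{BASE}/{owner}/{repo}"

url = quality_url("sunnynguyen-ai", "llm-attention-visualizer")
print(url)

# Fetching requires network access; "stars" is an assumed field name:
# data = json.load(urlopen(url))
# print(data.get("stars"))
```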
Higher-rated alternatives
microsoft/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
jadore801120/attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
bhavnicksm/vanilla-transformer-jax
JAX/Flax implementation of 'Attention Is All You Need' by Vaswani et al....
kyegomez/SparseAttention
PyTorch implementation of the sparse attention from the paper: "Generating Long Sequences with...
AbdelStark/attnres
Rust implementation of Attention Residuals from MoonshotAI/Kimi