tokentap and tokentop
These are direct competitors with nearly identical functionality: real-time terminal dashboards for monitoring LLM token usage and costs. Of the two, tokentap is the more mature and widely adopted option.
About tokentap
jmuncor/tokentap
Intercept LLM API traffic and visualize token usage in a real-time terminal dashboard. Track costs, debug prompts, and monitor context window usage across your AI development sessions.
This tool helps AI application developers monitor and debug their interactions with large language models (LLMs) in real time. It intercepts your LLM API calls and displays token usage, cost, and context window consumption in a live terminal dashboard. Developers building with LLM CLI tools such as Claude, OpenAI Codex, or MiniMax will find it essential for managing their AI-powered applications.
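To make the interception idea concrete, here is a minimal sketch (not tokentap's actual code, and the payload shape is only the familiar OpenAI-style `usage` field) of how an interceptor might tally token counts from the API responses it sees:

```python
# Hypothetical sketch: summing token usage from intercepted LLM API
# responses. Payloads mimic the OpenAI chat-completion "usage" field;
# a real interceptor would capture these from live HTTP traffic.

def tally_usage(responses):
    """Accumulate prompt/completion token counts across a session."""
    totals = {"prompt_tokens": 0, "completion_tokens": 0}
    for resp in responses:
        usage = resp.get("usage", {})
        totals["prompt_tokens"] += usage.get("prompt_tokens", 0)
        totals["completion_tokens"] += usage.get("completion_tokens", 0)
    return totals

# Two example payloads from one development session (made-up numbers)
session = [
    {"usage": {"prompt_tokens": 120, "completion_tokens": 45}},
    {"usage": {"prompt_tokens": 300, "completion_tokens": 80}},
]
print(tally_usage(session))  # {'prompt_tokens': 420, 'completion_tokens': 125}
```

A dashboard like tokentap's would redraw totals like these on every intercepted call, alongside per-model cost and context window headroom.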
About tokentop
tokentopapp/tokentop
htop for your AI costs — real-time terminal monitoring of LLM token usage and spending across providers and coding agents
This tool helps software developers and AI engineers track their spending on large language models (LLMs) and AI coding agents. It provides a real-time terminal display of how many tokens are being consumed and what they cost across providers such as OpenAI, Anthropic, and GitHub Copilot, turning your ongoing AI development work into an immediate, consolidated view of token usage and expenditure.
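The core of any such cost monitor is converting token counts into dollars with a per-provider price table. A minimal sketch (not tokentop's code; the provider names and per-million-token prices below are illustrative placeholders, not current list prices):

```python
# Hypothetical sketch: token counts -> USD cost via a price table.
# Prices are (input, output) USD per 1M tokens and are made up for
# illustration; a real tool would ship current per-model pricing.

PRICES = {
    "openai": (2.50, 10.00),
    "anthropic": (3.00, 15.00),
}

def cost_usd(provider, prompt_tokens, completion_tokens):
    """Cost of one call: input and output tokens priced separately."""
    in_price, out_price = PRICES[provider]
    return (prompt_tokens * in_price + completion_tokens * out_price) / 1_000_000

# A session total is just the sum of per-call costs
calls = [("openai", 120, 45), ("anthropic", 300, 80)]
total = sum(cost_usd(p, pt, ct) for p, pt, ct in calls)
```

Aggregating this per provider and per agent is what yields the "htop"-style consolidated spend view the tagline describes.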
Scores updated daily from GitHub, PyPI, and npm data.