azat-io/token-limit
🛰 Monitor how many tokens your code and configs consume in AI tools. Set budgets and get alerts when limits are hit
This tool helps developers manage the token consumption of AI-related files such as prompts, documentation, and configuration. It takes your project's text files as input and reports their token count and estimated cost for AI models such as OpenAI's GPT and Anthropic's Claude. Developers can use it to set budgets, get alerts, and avoid unexpected AI API costs or context-window overflows.
Available on npm.
Use this if you are a developer integrating AI into your projects and need to monitor, budget, and control the token usage and costs of your AI context files.
Not ideal if you are not working with AI models or are not concerned with token limits or costs for your text data.
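The budget-and-alert idea described above can be sketched with a rough character-based heuristic. This is only an illustration of the concept: token-limit itself uses real model-specific tokenizers, and the ~4 characters-per-token ratio, function names, and file contents below are assumptions made for the sketch.

```python
# Rough heuristic: ~4 characters per token for typical English text.
# Real tools (like token-limit) use model-specific tokenizers instead.
CHARS_PER_TOKEN = 4


def estimate_tokens(text: str) -> int:
    """Return a rough token estimate for the given text."""
    if not text:
        return 0
    return max(1, len(text) // CHARS_PER_TOKEN)


def check_budget(files: dict[str, str], budget: int) -> bool:
    """Return True if the combined estimated token count stays within budget."""
    total = sum(estimate_tokens(content) for content in files.values())
    print(f"Estimated tokens: {total} / budget {budget}")
    return total <= budget


# Example: check a hypothetical prompt file against a 1,000-token budget.
files = {"prompt.md": "You are a helpful assistant. " * 100}
within = check_budget(files, budget=1000)
```

In a CI setup, a failed check would typically translate into a non-zero exit code so the build fails before an oversized context reaches a paid API.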
Stars
61
Forks
1
Language
TypeScript
License
MIT
Category
Last pushed
Feb 08, 2026
Commits (30d)
0
Dependencies
6
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/azat-io/token-limit"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
jmuncor/tokentap
Intercept LLM API traffic and visualize token usage in a real-time terminal dashboard. Track...
AgentOps-AI/tokencost
Easy token price estimates for 400+ LLMs. TokenOps.
Merit-Systems/echo
The User Pays AI SDK
Ruthwik000/tokenfirewall
Scalable LLM cost enforcement middleware for Node.js with budget protection and multi-provider support
adarshxs/TokenTally
Estimate Your LLM's Token Toll Across Various Platforms and Configurations