azat-io/token-limit

🛰 Monitor how many tokens your code and configs consume in AI tools. Set budgets and get alerts when limits are hit

Quality score: 45 / 100 (Emerging)

This tool helps software developers manage the token consumption of their AI-related files, such as prompts, documentation, and configuration. It takes your project's text files as input and reports their token count and estimated cost for various AI models like OpenAI's GPT and Anthropic's Claude. Developers can use this to set budgets, get alerts, and prevent unexpected AI API costs or context window errors.
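To illustrate the idea behind token budgeting (not the project's actual implementation), here is a minimal sketch in Python using the common rough heuristic of about 4 characters per token for English text; `estimate_tokens` and `within_budget` are hypothetical helper names, not part of token-limit's API:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text.

    Real tools use the model's actual tokenizer; this is only a heuristic.
    """
    return max(1, len(text) // 4)


def within_budget(file_contents: str, limit: int) -> bool:
    """Check an AI context file's estimated token count against a budget."""
    return estimate_tokens(file_contents) <= limit


# Example: a 2,000-character prompt checked against a 600-token budget.
prompt = "x" * 2000
print(estimate_tokens(prompt))      # 500
print(within_budget(prompt, 600))   # True
```

A real checker would run the model-specific tokenizer (e.g. the one used by GPT or Claude) and fail a CI job when the budget is exceeded.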

Available on npm.

Use this if you are a developer integrating AI into your projects and need to monitor, budget, and control the token usage and costs of your AI context files.

Not ideal if you are not working with AI models or are not concerned with token limits or costs for your text data.

Tags: AI-development, prompt-engineering, software-cost-management, CI/CD-integration, API-cost-control
Maintenance: 10 / 25
Adoption: 8 / 25
Maturity: 24 / 25
Community: 3 / 25


Stars: 61
Forks: 1
Language: TypeScript
License: MIT
Last pushed: Feb 08, 2026
Commits (30d): 0
Dependencies: 6

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/azat-io/token-limit"

Open to everyone: 100 requests/day with no key needed. Get a free API key for 1,000 requests/day.