Ruthwik000/tokenfirewall
Scalable LLM cost enforcement middleware for Node.js with budget protection and multi-provider support
tokenfirewall is middleware for Node.js applications that call Large Language Model (LLM) APIs such as OpenAI or Anthropic. It prevents unexpected spending by tracking the cost of each API call and enforcing budget limits: you can monitor usage, set spending caps, and automatically fail over to backup models when a primary model is unavailable, keeping your application operational without exceeding your budget.
Available on npm.
Use this if you are a Node.js developer building an application that uses LLM APIs and you need to control spending, prevent budget overruns, and ensure application reliability with automatic model failover.
Not ideal if you are not a Node.js developer or if your primary concern is not LLM cost management and API reliability.
Stars: 21
Forks: 3
Language: TypeScript
License: MIT
Category:
Last pushed: Feb 27, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/Ruthwik000/tokenfirewall"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
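The curl command above can also be called from Node.js itself. A minimal sketch, using the endpoint URL shown in the listing; the shape of the JSON response is not documented here, so inspect it yourself rather than relying on the field names in this example:

```javascript
// Sketch: query the pt-edge quality API for a repository's data.
// The base URL comes from the listing above; everything else
// (function names, error handling) is illustrative.

const BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools";

function qualityUrl(owner, repo) {
  // Encode each path segment in case a slug contains unusual characters.
  return `${BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

async function fetchQuality(owner, repo) {
  // Global fetch is available in Node.js 18+.
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`API returned ${res.status}`);
  return res.json();
}

// Usage (uncomment to run; subject to the 100 requests/day limit):
// fetchQuality("Ruthwik000", "tokenfirewall").then(console.log);
```

Note that the anonymous limit is 100 requests/day, so cache responses if you poll this data regularly.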
Higher-rated alternatives
jmuncor/tokentap
Intercept LLM API traffic and visualize token usage in a real-time terminal dashboard. Track...
AgentOps-AI/tokencost
Easy token price estimates for 400+ LLMs. TokenOps.
Merit-Systems/echo
The User Pays AI SDK
adarshxs/TokenTally
Estimate Your LLM's Token Toll Across Various Platforms and Configurations
azat-io/token-limit
🛰 Monitor how many tokens your code and configs consume in AI tools. Set budgets and get alerts...