bricks-cloud/BricksLLM
🔒 Enterprise-grade API gateway that helps you monitor usage and impose cost or rate limits per API key. Get fine-grained access control and monitoring per user, application, or environment. Supports OpenAI, Azure OpenAI, Anthropic, vLLM, and open-source LLMs.
This tool helps teams manage and control how their applications use large language models (LLMs) from providers like OpenAI and Anthropic, or self-hosted models. It acts as a central point for applying usage, cost, and security rules to all LLM requests. Engineers, product managers, or business owners can use it to cap how much individual users or projects can spend or query, ensuring responsible and secure AI deployment.
1,173 stars. No commits in the last 6 months.
Use this if you need to manage access, monitor usage, control costs, and improve the reliability of LLMs being used across multiple applications or users within your organization.
Not ideal if you are a single user experimenting with LLMs and don't require granular control, monitoring, or enterprise-grade features.
Stars: 1,173
Forks: 93
Language: Go
License: MIT
Category:
Last pushed: Jan 05, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/bricks-cloud/BricksLLM"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
Higher-rated alternatives
BerriAI/litellm
Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with...
vava-nessa/free-coding-models
Find, benchmark and install in CLI 158 FREE coding LLM models across 20 providers in real time
envoyproxy/ai-gateway
Manages Unified Access to Generative AI Services built on Envoy Gateway
theopenco/llmgateway
Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.
Portkey-AI/gateway
A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs, 50+ AI Guardrails with...