bricks-cloud/BricksLLM

🔒 Enterprise-grade API gateway that helps you monitor and impose cost or rate limits per API key. Get fine-grained access control and monitoring per user, application, or environment. Supports OpenAI, Azure OpenAI, Anthropic, vLLM, and open-source LLMs.

Quality score: 44 / 100 (Emerging)

BricksLLM helps teams manage and control how their applications use large language models (LLMs) such as OpenAI, Anthropic, or custom models. It acts as a central gateway that applies usage, cost, and security rules to all LLM requests. Engineers, product managers, and business owners can use it to cap how much individual users or projects spend or query, supporting responsible, secure AI deployment.

1,173 stars. No commits in the last 6 months.

Use this if you need to manage access, monitor usage, control costs, and improve the reliability of LLMs being used across multiple applications or users within your organization.

Not ideal if you are a single user experimenting with LLMs and don't require granular control, monitoring, or enterprise-grade features.

Tags: LLM-operations, API-management, cost-control, AI-governance, production-AI
Flags: Stale (6 months), No Package, No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 18 / 25


Stars: 1,173
Forks: 93
Language: Go
License: MIT
Category: llm-api-gateways
Last pushed: Jan 05, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/bricks-cloud/BricksLLM"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
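The same endpoint can be called from any HTTP client. A minimal Python sketch using only the standard library; the live call is subject to the unauthenticated rate limit, and the response field names (`score`, `tier`, `stars`) are assumptions for illustration, not a documented schema:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repository quality endpoint URL.
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Live request -- counts against the 100 requests/day anonymous limit.
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

# Offline example with a hypothetical response shape (field names assumed):
sample = '{"score": 44, "tier": "Emerging", "stars": 1173}'
data = json.loads(sample)
print(quality_url("bricks-cloud", "BricksLLM"))
print(data["score"], data["tier"])
```

With a free API key, the raised 1,000/day limit would presumably be unlocked by an auth header, though the exact header name is not documented here.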