voidmind-io/voidllm

Privacy-first LLM proxy and AI gateway — load balancing, multi-provider routing, API key management, usage tracking, rate limiting. Self-hosted. Zero knowledge of your prompts.

Quality score: 37 / 100 (Emerging)

This self-hosted tool acts as a central control point for access to Large Language Model (LLM) providers such as OpenAI and Anthropic. It receives your applications' LLM requests and routes them efficiently and securely across providers, giving you full control over usage and costs. It is designed for engineering leads, IT operations managers, and enterprise architects who need to manage AI resources across their organization.

Use this if your team or organization uses multiple LLM providers and needs a centralized, private, and auditable way to manage access, control costs, and ensure reliability.

Not ideal if you are an individual developer using a single LLM provider and don't need advanced access control, usage tracking, or high-availability features.

Tags: AI-resource-management, LLM-operations, API-governance, cost-control, data-privacy
No package · No dependents
Maintenance: 13 / 25
Adoption: 6 / 25
Maturity: 9 / 25
Community: 9 / 25
(The four subscores sum to the overall score: 13 + 6 + 9 + 9 = 37.)

How are scores calculated?

Stars: 16
Forks: 2
Language: Go
License: (none listed)
Category: llm-api-gateways
Last pushed: Mar 28, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/voidmind-io/voidllm"

Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
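The same endpoint can also be consumed programmatically. Below is a minimal sketch in Go (the project's own language) that decodes a response body into a struct; the JSON field names (`score`, `tier`, and so on) are assumptions based on the figures shown on this page, not a documented schema, so check the live response before relying on them.

```go
// Sketch of consuming the quality API from Go.
// NOTE: the JSON field names here are assumptions inferred from the
// values displayed on this page; verify them against the live response.
package main

import (
	"encoding/json"
	"fmt"
)

// QualityReport holds the per-repo quality figures (assumed schema).
type QualityReport struct {
	Score       int    `json:"score"`
	Tier        string `json:"tier"`
	Maintenance int    `json:"maintenance"`
	Adoption    int    `json:"adoption"`
	Maturity    int    `json:"maturity"`
	Community   int    `json:"community"`
	Stars       int    `json:"stars"`
	Forks       int    `json:"forks"`
}

// parseReport decodes an API response body into a QualityReport.
func parseReport(body []byte) (QualityReport, error) {
	var r QualityReport
	err := json.Unmarshal(body, &r)
	return r, err
}

func main() {
	// Sample payload mirroring the figures on this page (schema assumed).
	sample := []byte(`{"score":37,"tier":"Emerging","maintenance":13,` +
		`"adoption":6,"maturity":9,"community":9,"stars":16,"forks":2}`)
	r, err := parseReport(sample)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s: %d/100 (maintenance %d, adoption %d, maturity %d, community %d)\n",
		r.Tier, r.Score, r.Maintenance, r.Adoption, r.Maturity, r.Community)
}
```

In a real client you would replace the sample payload with the body of an HTTP GET against the URL shown above (for example via `net/http`), keeping the struct as the decode target.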