voidmind-io/voidllm
Privacy-first LLM proxy and AI gateway — load balancing, multi-provider routing, API key management, usage tracking, rate limiting. Self-hosted. Zero knowledge of your prompts.
This is a self-hosted tool that acts as a central control point for accessing Large Language Model (LLM) providers such as OpenAI and Anthropic. It receives your applications' LLM requests and routes them efficiently and securely across providers, giving you full control over usage and costs. It's designed for engineering leads, IT operations managers, and enterprise architects who need to manage AI resources across their organization.
Use this if your team or organization uses multiple LLM providers and needs a centralized, private, and auditable way to manage access, control costs, and ensure reliability.
Not ideal if you are an individual developer only using one LLM provider and don't require advanced access control, usage tracking, or high availability features.
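The load balancing and multi-provider routing described above comes down to picking a provider for each incoming request. As a minimal sketch of that idea in Go, the project's language (the `Balancer` type and provider names here are illustrative, not voidllm's actual API or configuration), a round-robin selector might look like:

```go
package main

import (
	"fmt"
	"sync/atomic"
)

// Balancer cycles through a fixed list of upstream LLM providers.
// The provider names are placeholders, not voidllm's real config keys.
type Balancer struct {
	providers []string
	next      atomic.Uint64 // atomic counter, safe under concurrent requests
}

// Pick returns the next provider in round-robin order.
func (b *Balancer) Pick() string {
	n := b.next.Add(1) - 1
	return b.providers[n%uint64(len(b.providers))]
}

func main() {
	b := &Balancer{providers: []string{"openai", "anthropic", "mistral"}}
	for i := 0; i < 4; i++ {
		fmt.Println(b.Pick()) // openai, anthropic, mistral, openai
	}
}
```

A real gateway would layer provider health checks, per-key rate limits, and fallback on top of this selection step; the atomic counter keeps the selector correct when many requests arrive concurrently.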
Stars: 16
Forks: 2
Language: Go
License: —
Category: —
Last pushed: Mar 28, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/voidmind-io/voidllm"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
BerriAI/litellm
Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with...
vava-nessa/free-coding-models
Find, benchmark and install in CLI 158 FREE coding LLM models across 20 providers in real time
envoyproxy/ai-gateway
Manages Unified Access to Generative AI Services built on Envoy Gateway
theopenco/llmgateway
Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.
Portkey-AI/gateway
A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs, 50+ AI Guardrails with...