g1ibby/ollama-auth
This project provides a Docker image for running the Ollama service with basic authentication using the Caddy server.
This solution helps you run your Ollama large language model service securely by adding a layer of authentication. You provide an API key, and it ensures that only authorized applications or users can access your running Ollama instance and its models. This is for developers or system administrators who manage internal or external services powered by large language models.
Use this if you need to protect your Ollama instance from unauthorized access by requiring an API key for every request.
Not ideal if you're just running Ollama on your local machine for personal, isolated use and don't require any authentication.
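As a rough sketch of how such an image is typically deployed and queried. The image tag is taken from the repo name, but the port, environment-variable name (`OLLAMA_API_KEY`), and `Authorization: Bearer` scheme below are assumptions for illustration, not confirmed details of this project; check the repository's own README before use.

```shell
# Hypothetical usage sketch -- variable and port names are assumptions.
# Run the authenticated Ollama proxy; Caddy is assumed to require the
# configured key on every incoming request before forwarding to Ollama.
docker run -d \
  -e OLLAMA_API_KEY="my-secret-key" \
  -p 8080:8080 \
  g1ibby/ollama-auth

# A request carrying the matching key is proxied through to Ollama;
# a request without it should be rejected by Caddy.
curl -H "Authorization: Bearer my-secret-key" \
  http://localhost:8080/api/tags
```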
Stars: 17
Forks: 10
Language: Shell
License: MIT
Category:
Last pushed: Dec 15, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/g1ibby/ollama-auth"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
ollama/ollama
Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.
jd-opensource/JDOxyGent4J
JDOxyGent4J: The Java sibling of the OxyGent ecosystem.
sammcj/gollama
Go manage your Ollama models
dext7r/ollama-api-pool
Intelligent Ollama API proxy pool based on Cloudflare Workers
ollama4j/ollama4j-web-ui
Web UI for Ollama built in Java with Vaadin, Spring Boot and Ollama4j