g1ibby/ollama-auth

This project provides a Docker image for running the Ollama service with API-key authentication via the Caddy server.

Quality score: 45 / 100 (Emerging)

This solution helps you run your Ollama large language model service securely by adding a layer of authentication. You provide an API key, and it ensures that only authorized applications or users can access your running Ollama instance and its models. This is for developers or system administrators who manage internal or external services powered by large language models.

Use this if you need to protect your Ollama instance from unauthorized access by requiring an API key for every request.

Not ideal if you're just running Ollama on your local machine for personal, isolated use and don't require any authentication.
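The general pattern behind this kind of setup is a reverse proxy that checks each request for a valid key before forwarding it to Ollama. Below is a minimal Caddyfile sketch of that pattern — not this project's actual configuration; the listen port, the `ollama:11434` upstream address, and the `API_KEY` environment variable name are all assumptions for illustration:

```caddyfile
# Listen on port 8080 and gate all traffic behind an API key.
:8080 {
	# Match requests whose Authorization header carries the expected key
	# (API_KEY is read from the environment at runtime).
	@authorized header Authorization "Bearer {env.API_KEY}"

	# Authorized requests are forwarded to the Ollama backend.
	handle @authorized {
		reverse_proxy ollama:11434
	}

	# Everything else is rejected.
	handle {
		respond "Unauthorized" 401
	}
}
```

A client would then call the proxy instead of Ollama directly, e.g. `curl -H "Authorization: Bearer $API_KEY" http://localhost:8080/api/tags`.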

Tags: API security · large language model deployment · service authentication · Docker security
No package · No dependents
Maintenance 6 / 25
Adoption 6 / 25
Maturity 16 / 25
Community 17 / 25


Stars: 17
Forks: 10
Language: Shell
License: MIT
Last pushed: Dec 15, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/g1ibby/ollama-auth"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.