heyvaldemar/ollama-traefik-letsencrypt-docker-compose

Ollama with Let's Encrypt Using Docker Compose

Score: 38 / 100 (Emerging)

This project helps DevOps engineers and IT professionals quickly set up a local large language model (LLM) server using Ollama, accessible securely over the internet. You input your desired configuration variables in an .env file, and it outputs a running Ollama service with automatic SSL certificates from Let's Encrypt, managed by Traefik, all orchestrated via Docker Compose.

Use this if you need to deploy Ollama with secure, web-accessible endpoints for local LLM development or testing, without manually configuring SSL.

Not ideal if you are unfamiliar with Docker, Docker Compose, or network configuration, or if you need a production-grade, highly available LLM infrastructure.
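The workflow the description implies (fill in an .env file, then bring the stack up with Docker Compose) can be sketched as below. The variable names and hostname are illustrative assumptions, not the repository's actual keys; check the repo's own .env template and README for the real ones.

```shell
# Hypothetical .env — these variable names are assumptions, not the repo's actual keys.
cat > .env <<'EOF'
OLLAMA_HOSTNAME=ollama.example.com      # domain Traefik requests a certificate for
LETSENCRYPT_EMAIL=admin@example.com     # contact email for Let's Encrypt
EOF

# Start the stack; Traefik obtains and renews the TLS certificate automatically.
docker compose up -d

# Once DNS points the domain at this host, the Ollama API is reachable over HTTPS:
curl https://ollama.example.com/api/tags
```

This is a deployment fragment: it needs a host with Docker installed and a DNS record pointing at it before the certificate issuance can succeed.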

DevOps · LLM deployment · Docker orchestration · network security · local AI development
No license · No package · No dependents
Maintenance 10 / 25
Adoption 6 / 25
Maturity 8 / 25
Community 14 / 25


Stars: 23
Forks: 4
Language: Shell
License: none
Last pushed: Feb 19, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/heyvaldemar/ollama-traefik-letsencrypt-docker-compose"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.