grouzen/ollana

Ollama over LAN - automatically discover your Ollama server on your local network, no manual setup required.

Score: 28 / 100 · Experimental

This tool allows you to access your Ollama large language model server from any device on your local network, without needing to manually configure network settings or IP addresses. It automatically discovers your Ollama server and provides secure, authenticated access. Anyone using Ollama for local AI experiments, development, or creative work who wants to access their models from different computers, tablets, or phones will find this useful.

Use this if you have an Ollama server running on one computer and want to access it securely from other devices on the same home or office network without hassle.

Not ideal if you need to access your Ollama server from outside your local network or require integration with complex enterprise authentication systems.
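ollana itself is written in Rust, but once it has located your server, any device can talk to Ollama's standard HTTP API (which listens on port 11434 by default and exposes endpoints such as `/api/tags` for listing installed models). As a minimal sketch, a client on another machine might verify the discovered server like this; the host and port here are placeholders for whatever discovery returned, not part of ollana's own API:

```python
import json
import urllib.request
import urllib.error


def list_models(host="localhost", port=11434, timeout=2.0):
    """Query an Ollama server's /api/tags endpoint for installed models.

    Returns a list of model names, or None if the server is unreachable.
    The host/port would come from ollana's discovery step in practice.
    """
    url = f"http://{host}:{port}/api/tags"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        # Server not running, wrong address, or network filtered.
        return None
```

Returning `None` on failure (rather than raising) keeps the check easy to use in a discovery loop that probes candidate addresses.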

local-AI LLM-deployment network-connectivity developer-workflow personal-AI-server
No package · No dependents

Maintenance: 6 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 0 / 25


Stars: 7
Forks:
Language: Rust
License: MIT
Last pushed: Jan 03, 2026
Monthly downloads: 9
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/grouzen/ollana"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.