grouzen/ollana
Ollama over LAN - Auto-discover your Ollama server on your local network, hassle-free.
This tool lets you access your Ollama large language model server from any device on your local network without manually configuring network settings or IP addresses. It discovers the server automatically and provides secure, authenticated access. It's useful for anyone running Ollama for local AI experiments, development, or creative work who wants to reach their models from other computers, tablets, or phones.
Use this if you run an Ollama server on one computer and want secure access to it from other devices on the same home or office network.
Not ideal if you need to access your Ollama server from outside your local network or require integration with complex enterprise authentication systems.
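Once the server's LAN address is known, any device can talk to it over Ollama's standard HTTP API. A minimal sketch, assuming the server was found at a hypothetical address (192.168.1.50; 11434 is Ollama's default API port):

```shell
# Hypothetical LAN address of the discovered Ollama server.
OLLAMA_HOST="${OLLAMA_HOST:-192.168.1.50:11434}"
echo "Querying http://${OLLAMA_HOST}"

# List the models installed on the server (Ollama's /api/tags endpoint).
curl -s --connect-timeout 2 "http://${OLLAMA_HOST}/api/tags" || true

# Run a completion against one of them (/api/generate endpoint);
# the model name is an assumption, use whatever the server reports.
curl -s --connect-timeout 2 "http://${OLLAMA_HOST}/api/generate" \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}' || true
```

This is only the plain Ollama API over the network; the authenticated access layer that this tool adds is not shown here.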
Stars: 7
Forks: —
Language: Rust
License: MIT
Category:
Last pushed: Jan 03, 2026
Monthly downloads: 9
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/grouzen/ollana"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
- mishushakov/llm-scraper: Turn any webpage into structured data using LLMs
- Mobile-Artificial-Intelligence/maid: Maid is a free and open source application for interfacing with llama.cpp models locally, and...
- run-llama/LlamaIndexTS: Data framework for your LLM applications. Focus on server side solution
- JHubi1/ollama-app: A modern and easy-to-use client for Ollama
- serge-chat/serge: A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.