sammcj/gollama
Go manage your Ollama models
This tool helps you organize and optimize the large language models (LLMs) you run locally using Ollama. You can easily view, sort, modify, and delete your models to keep your local setup efficient. It's designed for anyone experimenting with or regularly using local LLMs on macOS or Linux.
Use this if you manage multiple Ollama models and want a straightforward way to keep track of them, clean up old ones, or estimate their memory usage.
Not ideal if you rarely use Ollama or only ever work with one or two models at a time.
Stars: 1,706
Forks: 103
Language: Go
License: MIT
Category:
Last pushed: Dec 30, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/sammcj/gollama"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
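If you'd rather call the endpoint from Go (the language of this repo) instead of curl, a minimal sketch might look like the following. The URL is taken from the curl example above; the response schema isn't documented on this page, so the fetch helper decodes into a generic map and is shown as an assumption.

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

// apiURL builds the endpoint shown in the curl example above.
func apiURL(owner, repo string) string {
	return fmt.Sprintf("https://pt-edge.onrender.com/api/v1/quality/llm-tools/%s/%s", owner, repo)
}

// fetchStats performs an unauthenticated GET and decodes the JSON body
// into a generic map, since the exact response fields aren't documented here.
func fetchStats(url string) (map[string]any, error) {
	resp, err := http.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return nil, err
	}
	var data map[string]any
	if err := json.Unmarshal(body, &data); err != nil {
		return nil, err
	}
	return data, nil
}

func main() {
	url := apiURL("sammcj", "gollama")
	fmt.Println("GET", url)
	// To actually perform the request (100/day without a key):
	//   stats, err := fetchStats(url)
	// then inspect the returned map for the fields you need.
}
```

Swapping in a different `owner`/`repo` pair queries any other listed project the same way.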
Higher-rated alternatives
ollama/ollama
Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.
jd-opensource/JDOxyGent4J
JDOxyGent4J: The Java sibling of the OxyGent ecosystem.
dext7r/ollama-api-pool
Intelligent Ollama API proxy pool based on Cloudflare Workers
ollama4j/ollama4j-web-ui
Web UI for Ollama built in Java with Vaadin, Spring Boot and Ollama4j
g1ibby/ollama-auth
This project provides a Docker image for running the Ollama service with basic authentication...