nart38/ollmao
Simple TUI client for ollama
A simple terminal chat interface for local AI models served by `ollama`: you type prompts and receive model responses, much like a chat application. It's aimed at developers who frequently use `ollama` for local large language model (LLM) work and prefer a terminal-based experience over web UIs or direct API calls.
No commits in the last 6 months.
Use this if you are a developer running `ollama` locally and want a simple, interactive chat interface within your terminal to experiment with different AI models.
Not ideal if you need advanced features like chat history export, multi-model comparisons, or a graphical user interface.
Stars: 8
Forks: —
Language: Go
License: BSD-3-Clause
Category: —
Last pushed: Feb 27, 2024
Commits (30d): 0
Get this data via API:

```shell
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/nart38/ollmao"
```

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000/day.
Higher-rated alternatives
ollama/ollama
Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.
jd-opensource/JDOxyGent4J
JDOxyGent4J: The Java sibling of the OxyGent ecosystem.
sammcj/gollama
Go manage your Ollama models
dext7r/ollama-api-pool
Intelligent Ollama API proxy pool based on Cloudflare Workers
ollama4j/ollama4j-web-ui
Web UI for Ollama built in Java with Vaadin, Spring Boot and Ollama4j