CodersCreative/ollama-chat-iced
A GUI built with iced and Rust that lets you chat with AI models through Ollama.
This application lets you interact with various AI language models directly on your computer, without a web browser. Type your questions or speak them, and the AI replies with text, much like popular AI chatbots. It suits anyone who wants a private, local way to use AI models for brainstorming, writing, coding assistance, or general information.
Use this if you want to run and chat with multiple AI models locally on your computer, with features like voice input, markdown support, and the ability to manage different conversations in separate panels or windows.
Not ideal if you prefer cloud-based AI services, do not want to manage local model downloads, or need to integrate AI capabilities into other software via an API.
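Under the hood, chat clients like this talk to a locally running Ollama server over HTTP. A minimal sketch of that interaction using Ollama's standard `/api/chat` endpoint (this is not code from this repository; `llama3` is a placeholder for whatever model you have pulled):

```shell
# Build a chat request for a local Ollama server (default address: http://localhost:11434).
# Assumes Ollama is installed and running; "llama3" is a placeholder model name.
PAYLOAD=$(cat <<'EOF'
{
  "model": "llama3",
  "messages": [{"role": "user", "content": "Suggest three names for a Rust GUI project."}],
  "stream": false
}
EOF
)

# Send the request; prints the assistant's JSON reply if the server is running.
curl -s http://localhost:11434/api/chat -d "$PAYLOAD" \
  || echo "Ollama server not reachable on localhost:11434"
```

Setting `"stream": false` asks Ollama to return one complete JSON response instead of a stream of partial tokens, which is simpler to handle in a one-off script.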
Stars: 13
Forks: 3
Language: Rust
License: —
Category:
Last pushed: Jan 03, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/CodersCreative/ollama-chat-iced"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.
Higher-rated alternatives
dtnewman/zev
A simple CLI tool to help you remember terminal commands
hecrj/icebreaker
A local AI chat app powered by 🦀 Rust, 🧊 iced, 🤗 Hugging Face, and 🦙 llama.cpp
adammpkins/llama-terminal-completion
AI terminal assistant for any OpenAI-compatible API. Features interactive chat TUI, command...
milosgajdos/bot-banter
Go vs Rust AI bot voice conversation
dustinblackman/oatmeal
Terminal UI to chat with large language models (LLM) using different model backends, and...