JHubi1/ollama-app

A modern and easy-to-use client for Ollama

Quality score: 53 / 100 (Established)

This tool provides a user-friendly interface to interact with your local, private AI models. You input prompts or questions, and the application displays responses generated by an Ollama server running on your network. Anyone who wants to use large language models locally and privately, without sending data to external services, would benefit from this.


Use this if you want to chat with AI models you host yourself, ensuring your conversations remain entirely private within your local network.

Not ideal if you're looking for a cloud-based AI solution or do not have an Ollama server already set up.

Tags: local AI, private LLM, personal assistant, data privacy, AI interaction
No package published · No dependents
Score breakdown:
Maintenance: 6 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 21 / 25


Stars: 1,670
Forks: 196
Language: Dart
License: Apache-2.0
Last pushed: Oct 28, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/JHubi1/ollama-app"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
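The same request can also be made from a script. Below is a minimal Python sketch using only the standard library; the endpoint path is taken verbatim from the curl example above, the helper names are illustrative and not part of the API, and no assumptions are made about the response schema beyond it being JSON.

```python
# Minimal sketch of calling the quality endpoint from Python instead of curl.
# The base URL and path segments come from the curl example; the function
# names (quality_url, fetch_quality) are hypothetical helpers, not API names.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(registry: str, owner: str, repo: str) -> str:
    """Build the endpoint URL for a given repository."""
    return f"{BASE}/{registry}/{owner}/{repo}"


def fetch_quality(registry: str, owner: str, repo: str) -> dict:
    """Fetch the quality record and parse it as JSON (schema not assumed)."""
    with urllib.request.urlopen(quality_url(registry, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    data = fetch_quality("transformers", "JHubi1", "ollama-app")
    print(json.dumps(data, indent=2))
```

Note that unauthenticated calls count against the 100-request daily limit, so cache the response rather than fetching it on every run.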