fredericksalazar/OllamaFX

OllamaFX is a native, lightweight, and professional JavaFX desktop client for Ollama. Run Llama 3, Mistral, and Phi-3 locally with maximum privacy, low RAM overhead, and a powerful interface designed for power users and developers.

Quality score: 38 / 100 (Emerging)

This desktop application helps you privately run and chat with large language models (LLMs) like Llama 3 or Mistral directly on your computer. You can easily install, manage, and remove different models, then engage in multiple chat sessions with them through a clean, intuitive interface. It's designed for individuals who want to use advanced AI models for tasks without sending their data to external servers, prioritizing privacy and local control.

Use this if you need a user-friendly way to interact with powerful AI models locally on your computer, keeping your conversations private and under your control.

Not ideal if you need a cloud-based AI solution or do not have a computer capable of running LLMs locally.

Tags: local AI, private AI chat, desktop AI, large language models, personal AI assistant
No package · No dependents
Maintenance: 10 / 25
Adoption: 8 / 25
Maturity: 15 / 25
Community: 5 / 25


Stars: 45
Forks: 2
Language: Java
License: MIT
Last pushed: Mar 06, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/fredericksalazar/OllamaFX"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
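The same endpoint pattern works for other repositories on the index. A minimal sketch of a helper that composes the URL for any `owner/name` pair, assuming only the endpoint shape shown in the curl example above (the authentication mechanism for keyed access is not documented here, so it is omitted):

```shell
#!/bin/sh
# Compose the quality-API URL for a repo on the llm-tools index.
# BASE is taken from the example curl command above.
BASE="https://pt-edge.onrender.com/api/v1/quality/llm-tools"

# repo_url OWNER/NAME -> full API URL on stdout
repo_url() {
  printf '%s/%s\n' "$BASE" "$1"
}

# Print the URL for this project; pipe it to curl to fetch the data:
#   curl -s "$(repo_url fredericksalazar/OllamaFX)"
repo_url "fredericksalazar/OllamaFX"
```

Keeping the URL construction in one place makes it easy to loop over a list of repositories without repeating the base path.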