libre-webui and llama.ui
These projects are competitors: both provide minimal web interfaces for chatting with locally hosted language models, with overlapping support for Ollama and similar local AI backends. libre-webui emphasizes privacy and an extensible plugin system, while llama.ui prioritizes running entirely in the browser.
About libre-webui
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible plugin system. No data leaves your device.
Libre WebUI is a self-hosted chat interface for interacting with AI models, whether they run locally on your computer or through various cloud services. You enter questions or prompts, and it returns AI-generated responses, images, or even speech. It is aimed at anyone who wants to use AI chat tools while keeping control over their conversations and personal data.
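Interfaces like this typically sit on top of Ollama's local HTTP API. As a minimal sketch (assuming Ollama's default endpoint at `localhost:11434` and its documented NDJSON streaming format), this shows the shape of a `/api/generate` request and how streamed reply fragments are pieced back together:

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_generate_request(model: str, prompt: str, stream: bool = True) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}

def collect_stream(ndjson_lines) -> str:
    """Concatenate the 'response' fragments from a streamed NDJSON reply."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Illustrative streamed reply (not captured from a live server):
sample = [
    '{"response": "Hello", "done": false}',
    '{"response": ", world", "done": true}',
]
print(collect_stream(sample))  # Hello, world
```

A chat UI would POST the request body to `OLLAMA_URL` and feed the response lines into `collect_stream` as they arrive; the sample data here stands in for a live server.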
About llama.ui
olegshulyakov/llama.ui
A minimal interface for AI Companion that runs entirely in your browser.
This browser-based application helps you chat privately with advanced AI models directly on your computer. You supply a large language model, and the application gives you a friendly interface to type questions and receive AI responses. It's for anyone who wants to use AI for writing, brainstorming, or getting information without sending their data to cloud servers.
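A browser UI like this is commonly pointed at a local OpenAI-compatible server (for example, llama.cpp's `llama-server`). As a minimal sketch, assuming such a server at `localhost:8080`, this shows the request body the UI would send and how the assistant's text is extracted from an OpenAI-style response:

```python
ENDPOINT = "http://localhost:8080/v1/chat/completions"  # assumed local server address

def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-compatible chat completion request body."""
    return {"model": model, "messages": [{"role": "user", "content": user_message}]}

def extract_reply(response_json: dict) -> str:
    """Pull the assistant's text out of an OpenAI-style response."""
    return response_json["choices"][0]["message"]["content"]

# Illustrative response (not captured from a live server):
sample_response = {"choices": [{"message": {"role": "assistant", "content": "Hi there!"}}]}
print(extract_reply(sample_response))  # Hi there!
```

Because the protocol is the standard OpenAI chat-completions shape, the same frontend can work against any compatible backend without changes.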