libre-webui and llama.ui

These are competitors: both provide minimal web interfaces for running local language models, with overlapping functionality for Ollama/local AI interaction, though libre-webui emphasizes privacy and extensibility while llama.ui prioritizes in-browser execution.

                  libre-webui         llama.ui
Score             64 (Established)    52 (Established)
Maintenance       10/25               10/25
Adoption          13/25               10/25
Maturity          24/25               15/25
Community         17/25               17/25
Stars             34                  175
Forks             8                   23
Downloads         316                 —
Commits (30d)     0                   0
Language          TypeScript          TypeScript
License           Apache-2.0          MIT
Risk flags        None                No package, no dependents

About libre-webui

libre-webui/libre-webui

Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible plugin system. No data leaves your device.

Libre WebUI is a self-hosted chat interface that lets you interact with AI models, either running locally on your computer or through various cloud services. You input your questions or prompts, and it returns AI-generated responses, images, or even speech. It's for anyone who wants to use AI chat tools while prioritizing data privacy and control over their conversations and personal information.

personal-ai-assistant private-data-processing secure-communication local-ai-deployment document-interrogation
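For a sense of what "interacting with local AI models" means in practice, here is a minimal TypeScript sketch of the kind of call a chat UI in this space makes against a local Ollama server. The endpoint and field names follow Ollama's documented `/api/chat` REST API; the model name `llama3` and the prompt are assumptions for illustration, not anything libre-webui specifically ships.

```typescript
// Shape of a single chat turn, matching Ollama's /api/chat message format.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body for a non-streaming Ollama /api/chat request.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

// POST the request to a locally running Ollama instance (default port 11434)
// and return the assistant's reply text. Nothing leaves your machine.
async function chat(prompt: string): Promise<string> {
  const body = buildChatRequest("llama3", [{ role: "user", content: prompt }]);
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const data = await res.json();
  return data.message.content; // Ollama returns the reply under message.content
}
```

With `stream: false` the server sends one complete JSON object; real chat UIs usually set `stream: true` and render tokens as newline-delimited JSON chunks arrive.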

About llama.ui

olegshulyakov/llama.ui

A minimal interface for AI Companion that runs entirely in your browser.

This browser-based application helps you chat privately with advanced AI models running on your own computer. You point it at a local language model, and the application gives you a friendly interface to type questions and receive AI responses. It's for anyone who wants to use AI for writing, brainstorming, or getting information without sending their data to cloud servers.

personal-ai private-chatbots local-ai-models offline-ai knowledge-worker-tools
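A browser-only UI still needs an inference backend to talk to; a common setup is llama.cpp's `llama-server`, which exposes an OpenAI-compatible `/v1/chat/completions` endpoint. The sketch below shows that request shape in TypeScript. The base URL, model name, and token limit are assumptions; this is not llama.ui's actual client code.

```typescript
// One chat turn in the OpenAI-compatible wire format.
interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build an OpenAI-style chat completion request body.
function buildCompletionRequest(messages: Message[], maxTokens = 256) {
  return {
    model: "local-model", // local servers often ignore or echo this field
    messages,
    max_tokens: maxTokens,
    stream: false,
  };
}

// Send the request to a local OpenAI-compatible server (e.g. llama-server
// started with a GGUF model file) and return the reply text.
async function complete(baseUrl: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(
      buildCompletionRequest([{ role: "user", content: prompt }]),
    ),
  });
  const data = await res.json();
  return data.choices[0].message.content; // OpenAI-style response shape
}
```

Because the wire format is OpenAI-compatible, the same UI can be pointed at llama.cpp, a cloud provider, or any other server that speaks this protocol just by changing `baseUrl`.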

Scores updated daily from GitHub, PyPI, and npm data.