GHuyHuynh/openllm-web
Open-source, self-hosted LLM chat application, featuring local-first data storage and real-time streaming responses.
This is an open-source, self-hosted web application that provides a chat interface for large language models (LLMs). You type prompts and receive streaming responses, much like popular AI chatbots. All conversations are saved directly in your browser, so your data stays private and remains accessible offline. It is designed for anyone who wants to chat with an LLM while keeping full control over their data and conversation history.
Use this if you want a private, self-hosted chat interface to interact with AI models and keep all your conversation data stored locally on your own device.
Not ideal if you need a multi-user platform with server-side database storage or integration with external collaboration tools.
Stars
10
Forks
4
Language
TypeScript
License
MIT
Category
Last pushed
Oct 15, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/GHuyHuynh/openllm-web"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
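The curl example above suggests the endpoint path follows the pattern `/api/v1/quality/llm-tools/{owner}/{repo}`. A minimal Python sketch, assuming that pattern generalizes to other repositories (the helper name is illustrative, not part of the API):

```python
def quality_url(owner: str, repo: str) -> str:
    """Build the pt-edge quality endpoint URL for a repository.

    Assumption: the path pattern generalizes from the documented
    example for GHuyHuynh/openllm-web.
    """
    return f"https://pt-edge.onrender.com/api/v1/quality/llm-tools/{owner}/{repo}"

# Reproduces the documented example URL:
print(quality_url("GHuyHuynh", "openllm-web"))
```

You could pass the resulting URL to any HTTP client (curl, `requests`, etc.); within the free tier no key is needed for up to 100 requests/day.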
Higher-rated alternatives
SillyTavern/SillyTavern
LLM Frontend for Power Users.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.