jakobhoeg/nextjs-ollama-llm-ui
Fully-featured web interface for Ollama LLMs
This is a user-friendly chat interface for interacting with Large Language Models (LLMs) running on your own computer. You type a prompt and the model generates a response, much like popular hosted AI chatbots. It suits anyone who wants to experiment with or regularly use local LLMs without a complex setup.
1,415 stars. No commits in the last 6 months.
Use this if you want a simple, local, and offline way to chat with various Large Language Models that you've installed with Ollama, using a familiar interface.
Not ideal if you need advanced features like prompt engineering, agentic workflows, or integration with external services, as this is a more basic, hobby-focused interface.
Stars: 1,415
Forks: 333
Language: TypeScript
License: MIT
Category:
Last pushed: Jun 05, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/jakobhoeg/nextjs-ollama-llm-ui"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
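The same endpoint can be called programmatically. A minimal TypeScript sketch, assuming only the URL shape shown in the curl example above (the JSON response schema is not documented here, so it is typed loosely; `qualityUrl` and `fetchQuality` are hypothetical helper names):

```typescript
// Base path taken from the curl example above; "llm-tools" is the
// category segment used for this repository.
const API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools";

// Build the quality-endpoint URL for a given GitHub owner/repo pair.
export function qualityUrl(owner: string, repo: string): string {
  return `${API_BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

// Fetch the quality data (network call; subject to the 100 requests/day
// keyless limit noted above). Response shape is not documented, so we
// return it as `unknown` for the caller to validate.
export async function fetchQuality(owner: string, repo: string): Promise<unknown> {
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`API request failed: ${res.status}`);
  return res.json();
}
```

For example, `fetchQuality("jakobhoeg", "nextjs-ollama-llm-ui")` requests the same data as the curl command above.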
Related tools
SillyTavern/SillyTavern
LLM Frontend for Power Users.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.