Shishir435/ollama-client
Ollama Client – Chat with Local LLMs Inside Your Browser
A lightweight, privacy-first Chrome extension to chat with local LLMs via Ollama, LM Studio, and llama.cpp. Supports streaming, stop/regenerate, RAG, and easy model switching — all without cloud APIs or data leaks.
This tool is a browser extension that lets you chat with AI models running directly on your computer, without sending any of your data to external cloud services. You type in questions or prompts, and the local AI model generates responses, all within your browser's side panel. It's designed for anyone who wants to use AI assistance while keeping their conversations and data completely private and offline.
Use this if you need a confidential AI assistant for tasks like drafting content, brainstorming, or summarizing documents, and you're already running local AI models (like Ollama, LM Studio, or llama.cpp) on your machine.
Not ideal if you expect the simplicity and reliability of cloud-based AI services, or if you don't want to manage local AI server software on your computer.
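Under the hood, an extension like this talks to a model server running on your machine. A minimal sketch of how a client might call Ollama's local chat endpoint (`POST /api/chat` on `localhost:11434`, Ollama's default port); the model name `llama3` is an assumption — substitute any model shown by `ollama list`:

```typescript
// Sketch of a client for Ollama's local chat API (not this extension's code).
// Assumes Ollama is serving on its default port, localhost:11434.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body for a POST to /api/chat.
function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  stream = true
) {
  return { model, messages, stream };
}

async function chat(prompt: string): Promise<string> {
  // "llama3" is an assumed model name; use whatever is installed locally.
  const body = buildChatRequest("llama3", [{ role: "user", content: prompt }]);
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  // With stream: true, Ollama returns newline-delimited JSON chunks,
  // each carrying a fragment of the assistant's reply.
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let reply = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) reply += chunk.message.content;
    }
  }
  return reply;
}
```

Because the response is streamed chunk by chunk, a UI can render tokens as they arrive and support stop/regenerate mid-response, which is what the feature list above refers to.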
Stars: 27
Forks: 4
Language: TypeScript
License: MIT
Category:
Last pushed: Feb 08, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/Shishir435/ollama-client"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
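The same endpoint can be queried from code. A minimal sketch: only the URL pattern is taken from the curl example above, and the response fields are not documented here, so the result is returned as untyped JSON:

```typescript
// Hypothetical helper around the quality API shown in the curl example.
// The URL pattern is the only part taken from the docs; the response
// shape is unspecified, so it is returned as `unknown`.

function qualityUrl(owner: string, repo: string): string {
  return `https://pt-edge.onrender.com/api/v1/quality/llm-tools/${owner}/${repo}`;
}

async function fetchQuality(owner: string, repo: string): Promise<unknown> {
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}

// Example usage:
// fetchQuality("Shishir435", "ollama-client").then((data) => console.log(data));
```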
Higher-rated alternatives
SillyTavern/SillyTavern
LLM Frontend for Power Users.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.