Shishir435/ollama-client

Ollama Client – Chat with Local LLMs Inside Your Browser

A lightweight, privacy‑first Chrome extension to chat with local LLMs via Ollama, LM Studio, and llama.cpp. Supports streaming, stop/regenerate, RAG, and easy model switching, all without cloud APIs or data leaks.

Quality score: 44 / 100 (Emerging)

This tool is a browser extension that lets you chat with AI models running directly on your computer, without sending any of your data to external cloud services. You type in questions or prompts, and the local AI model generates responses, all within your browser's side panel. It's designed for anyone who wants to use AI assistance while keeping their conversations and data completely private and offline.

Use this if you need a confidential AI assistant for tasks like drafting content, brainstorming, or summarizing documents, and you're already running local AI models (like Ollama, LM Studio, or llama.cpp) on your machine.

Not ideal if you expect the simplicity and reliability of cloud-based AI services, or if you don't want to manage local AI server software on your computer.
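
For a sense of what "chatting with local LLMs via Ollama" means in practice, below is a minimal TypeScript sketch of the kind of streaming request an extension like this would send to a local Ollama server. It assumes Ollama's default port 11434 and its standard /api/chat endpoint; the model name "llama3" and the Node-style output are illustrative assumptions, not the extension's actual code.

```typescript
// Minimal sketch of a streaming chat request to a local Ollama server.
// Assumptions: Ollama is running on its default port 11434, a model named
// "llama3" has been pulled, and output goes to stdout (a browser extension
// would append tokens to its side-panel UI instead).
async function chatWithOllama(prompt: string): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: prompt }],
      stream: true, // Ollama streams newline-delimited JSON chunks
    }),
  });
  if (!res.ok || !res.body) {
    throw new Error(`Ollama request failed: ${res.status}`);
  }

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffered = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    // Each complete line is one JSON chunk; keep any partial line buffered.
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      process.stdout.write(chunk.message?.content ?? "");
      if (chunk.done) return; // final chunk carries timing stats, no content
    }
  }
}

chatWithOllama("Summarize this page in two sentences.").catch(console.error);
```

Because every request in this loop goes to localhost, the prompt and the model's reply never leave the machine, which is the privacy property the extension is built around.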

Tags: private-chat-ai, offline-ai-assistant, local-data-processing, personal-productivity, secure-information-handling

No package published · No dependents
Maintenance: 10 / 25
Adoption: 7 / 25
Maturity: 15 / 25
Community: 12 / 25

The four category scores, each out of 25, sum to the overall rating shown above: 10 + 7 + 15 + 12 = 44 / 100.


Stars: 27
Forks: 4
Language: TypeScript
License: MIT
Last pushed: Feb 08, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/Shishir435/ollama-client"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
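
If you prefer TypeScript over curl, a hedged sketch of the same call is below. The QualityReport field names are guesses inferred from the values this page displays; the API's actual response schema is not documented here.

```typescript
// Minimal sketch: fetch the quality data for a repo from the public endpoint.
// The field names in QualityReport are assumptions based on what this page
// shows (score, stars, forks, ...), not a documented schema.
interface QualityReport {
  score?: number;    // e.g. 44 (assumed field name)
  stars?: number;    // e.g. 27 (assumed field name)
  forks?: number;    // e.g. 4 (assumed field name)
  language?: string; // e.g. "TypeScript" (assumed field name)
  license?: string;  // e.g. "MIT" (assumed field name)
}

async function getQuality(repo: string): Promise<QualityReport> {
  const res = await fetch(
    `https://pt-edge.onrender.com/api/v1/quality/llm-tools/${repo}`
  );
  if (!res.ok) {
    throw new Error(`Quality API returned ${res.status}`);
  }
  return (await res.json()) as QualityReport;
}

getQuality("Shishir435/ollama-client")
  .then((report) => console.log(report))
  .catch(console.error);
```

Keep the 100-requests/day anonymous limit in mind if you poll this endpoint on a schedule.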