kartikm7/llocal
Aiming to provide a seamless, privacy-driven chatting experience with open-sourced technologies (Ollama), particularly open-sourced LLMs (e.g. Llama 3, Phi-3, Mistral). Focused on ease of use. Available on both Windows and Mac.
LLocal provides a private, secure way to chat with AI models directly on your computer, without sending your data to external servers. You can input text, upload documents (like PDFs or Word files), and even images to receive AI-generated responses, summaries, or insights. This tool is for anyone who wants to use AI chatbots while keeping their conversations and data completely confidential and offline.
167 stars. No commits in the last 6 months.
Use this if you need a privacy-focused AI chat application that processes all your data locally on your computer, without relying on cloud services.
Not ideal if you prefer web-based AI tools, or if you need advanced integrations with external software and cloud platforms.
Stars: 167
Forks: 24
Language: TypeScript
License: MIT
Category:
Last pushed: Oct 08, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/kartikm7/llocal"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
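For scripted access, the curl endpoint above can also be called from code. A minimal Python sketch, assuming only the URL pattern shown in the curl example (the shape of the JSON response is not documented here and is an assumption):

```python
# Sketch: build and fetch the quality-API URL for a GitHub repo.
# Only the URL pattern from the curl example above is assumed;
# response fields are not documented here.
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Return the API endpoint for a given owner/repo."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality data for a repo."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)
```

For example, `quality_url("kartikm7", "llocal")` reproduces the exact URL from the curl command; unauthenticated callers are limited to 100 requests/day.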
Higher-rated alternatives
- SillyTavern/SillyTavern: LLM Frontend for Power Users.
- libre-webui/libre-webui: Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
- chyok/ollama-gui: A single-file tkinter-based Ollama GUI project with no external dependencies.
- matlab-deep-learning/llms-with-matlab: Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
- ollama4j/ollama4j: A simple Java library for interacting with Ollama server.