qusaismael/localllm
Your LLM, Your Data, Your GUI.
This tool provides a private, web-based chat interface to interact with large language models (LLMs) running on your own computer. You feed it your questions or prompts, and it responds based on the language model you've chosen, keeping a record of your discussions. This is for anyone who wants to use AI chat privately, without sending their data to external services.
Use this if you need a secure and private way to chat with AI models using your own data, without an internet connection or cloud services.
Not ideal if you need advanced AI features like image generation, complex data analysis integrations, or real-time access to the very latest internet information.
Stars: 93
Forks: 13
Language: JavaScript
License: MIT
Category:
Last pushed: Feb 05, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/qusaismael/localllm"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
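The per-repo endpoint above follows a predictable pattern, so a small helper can build the URL for any repository listed here. A minimal sketch, assuming the `llm-tools` path segment shown in the example applies to every tool in this category (other categories may use a different slug — check the API docs):

```shell
# Build the quality-API URL for a given owner/repo pair.
# The "llm-tools" segment is taken from this page's example;
# it is an assumption that it covers all tools in this category.
quality_url() {
  local owner="$1" repo="$2"
  printf 'https://pt-edge.onrender.com/api/v1/quality/llm-tools/%s/%s\n' "$owner" "$repo"
}

# Usage (100 requests/day without a key):
#   curl "$(quality_url qusaismael localllm)"
```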
Related tools
mlc-ai/web-llm
High-performance In-browser LLM Inference Engine
e2b-dev/desktop
E2B Desktop Sandbox for LLMs. E2B Sandbox with desktop graphical environment that you can...
geekjr/quickai
QuickAI is a Python library that makes it extremely easy to experiment with state-of-the-art...
Azure-Samples/llama-index-javascript
This sample shows how to quickly get started with LlamaIndex.ai on Azure 🚀
AkagawaTsurunaki/zerolan-core
ZerolanCore integrates many open-source, locally deployable AI models, and aims to integrate a...