zuramai/ChatLLM
Chat with LLM directly in your browser
Chat with large language models (LLMs) privately, directly in your web browser. You type a prompt and the model generates a response locally on your graphics card (GPU), so conversations never leave your machine. This is for people who want to experiment with AI chatbots without sending their data to external servers.
No commits in the last 6 months.
Use this if you want a private, local way to interact with LLMs directly in your browser without cloud services or complex installations.
Not ideal if you need access to the most powerful, cutting-edge LLMs that require significant cloud computing resources, or if you don't have a compatible GPU.
Stars
44
Forks
7
Language
Vue
License
MIT
Category
llm-tools
Last pushed
Dec 10, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/zuramai/ChatLLM"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
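For scripted use, the curl command above can be wrapped in a few lines of Python. This is a minimal sketch: the URL pattern is taken from the example above, but the shape of the JSON payload is not documented here, so the code only fetches and decodes it without assuming any fields.

```python
# Sketch: query the pt-edge quality API for a repository.
# The endpoint pattern is inferred from the curl example above;
# the response schema is an assumption (we only decode raw JSON).
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (no API key: 100 requests/day)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


# Example (performs a network request):
# data = fetch_quality("llm-tools", "zuramai", "ChatLLM")
```

With a free API key (1,000 requests/day), authentication would presumably be added to the request, but the key mechanism is not shown on this page, so it is omitted here.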
Higher-rated alternatives
SillyTavern/SillyTavern
LLM Frontend for Power Users.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.