travishorn/ollapa
An Ollama client built with Svelte 5 and SvelteKit. Chat entirely locally and client-side with a friendly interface!
Ollapa provides a user-friendly chat interface for interacting with locally installed AI models, such as Llama 3.2, without sending your data to external servers. You bring your own local model; Ollapa handles the chat experience and keeps it private. Anyone concerned about data privacy when using AI models can benefit from this.
No commits in the last 6 months.
Use this if you want to chat with large language models like Llama 3.2 on your own computer, ensuring your conversations and data remain completely private and offline.
Not ideal if your computer isn't powerful enough to run large language models locally, or if you prefer cloud-based AI services.
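For context, a client like Ollapa talks to the local Ollama server over its HTTP API. The sketch below is illustrative, not Ollapa's actual code: it assumes Ollama is running on its default port (11434) and that the model name used here (`llama3.2`) has already been pulled with `ollama pull`.

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body for Ollama's /api/chat endpoint.
// stream: false asks for one complete response instead of chunks.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

// Send a single prompt to the local Ollama server and return the reply.
async function chat(prompt: string): Promise<string> {
  const body = buildChatRequest("llama3.2", [
    { role: "user", content: prompt },
  ]);
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const data = await res.json();
  // With stream: false, Ollama returns the reply under message.content.
  return data.message.content;
}
```

Because everything goes to `localhost`, no conversation data ever leaves your machine.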
Stars: 8
Forks: 2
Language: Svelte
License: MIT
Category:
Last pushed: Nov 01, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/travishorn/ollapa"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
SillyTavern/SillyTavern
LLM Frontend for Power Users.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.