deployradiant/pajama
A UI for Ollama on Mac
Pajama gives Mac users a simple way to interact with open-source language models running locally through Ollama. You type a prompt or question and the app returns the model's generated response, making it easy to experiment with different models. It is aimed at anyone on a Mac who wants to use local AI models without writing code or using command-line tools.
No commits in the last 6 months.
Use this if you are a macOS user looking for a straightforward, visual interface to chat with open-source AI models installed on your machine via Ollama.
Not ideal if you need to connect to cloud-based AI services like OpenAI or prefer to interact with models through a command line or API.
Stars: 17
Forks: 4
Language: Swift
License: MIT
Category:
Last pushed: Jan 22, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/deployradiant/pajama"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
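The same endpoint can be queried programmatically. A minimal Python sketch, assuming only what the curl example above shows (a GET request returning a response body); the response is presumed to be JSON, and its field names are not documented on this page, so the code makes no assumptions about specific keys:

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE_URL = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def build_quality_url(owner: str, repo: str) -> str:
    """Construct the per-repository quality endpoint URL."""
    return f"{BASE_URL}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the quality report for a repository.

    Presumes the endpoint returns a JSON object; inspect the
    result rather than relying on specific keys, since the
    response schema is not documented here.
    """
    with urllib.request.urlopen(build_quality_url(owner, repo)) as resp:
        return json.load(resp)


# Usage: fetch_quality("deployradiant", "pajama")
```

With a free API key, the key would presumably be passed as a header or query parameter; the page does not specify which, so that part is left out.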
Higher-rated alternatives
SillyTavern/SillyTavern
LLM Frontend for Power Users.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.