pejuko/follamac
Follamac is a desktop application that provides a convenient way to work with Ollama and large language models (LLMs).
It helps you interact with large language models running locally on your computer: you provide prompts or questions, and the application returns responses generated by the model. It is for anyone who wants to experiment with or use AI models like Mistral for tasks such as writing assistance, brainstorming, or getting information, without relying on cloud services.
No commits in the last 6 months.
Use this if you want a user-friendly interface to manage and chat with AI models you've set up with Ollama on your own machine.
Not ideal if you're looking for a cloud-based AI chatbot or don't want to run AI models directly on your computer.
Stars: 26
Forks: 4
Language: Vue
License: MIT
Category:
Last pushed: Feb 23, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/pejuko/follamac"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
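The curl command above can also be issued programmatically. Below is a minimal Python sketch using only the standard library; the response schema is not documented here, so the code simply returns the parsed JSON, and the `Authorization: Bearer` header used for the optional API key is an assumption, not a documented part of this API.

```python
import json
import urllib.request

# Base URL taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def build_url(category: str, owner: str, repo: str) -> str:
    """Construct the endpoint URL for a repository's quality data."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str, api_key: str = None) -> dict:
    """Fetch quality data for a repo; a free key raises the limit to 1,000/day."""
    req = urllib.request.Request(build_url(category, owner, repo))
    if api_key:
        # Header name is an assumption -- check the API docs for the real scheme.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Build the URL for this repository (no network call needed for this step).
url = build_url("llm-tools", "pejuko", "follamac")
print(url)
```

Calling `fetch_quality("llm-tools", "pejuko", "follamac")` performs the same request as the curl example, within the 100-requests/day anonymous limit.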
Higher-rated alternatives
SillyTavern/SillyTavern
LLM Frontend for Power Users.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.