ahmetkca/PolyOllama
Run multiple open-source large language models concurrently, powered by Ollama.
This tool helps developers or researchers efficiently compare responses from various open-source large language models (LLMs) side-by-side. You input a prompt, and it simultaneously queries different models like Llama2, Mistral, and Gemma, displaying all their outputs at once. This is ideal for anyone evaluating or prototyping with multiple local LLMs.
No commits in the last 6 months.
Use this if you need to quickly see how different open-source large language models respond to the same query without running them sequentially.
Not ideal if you're looking for advanced LLM orchestration, production deployment, or integration with commercial cloud-based models.
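The concurrent fan-out described above can be sketched in TypeScript. This is a minimal illustration of the pattern, not PolyOllama's actual code: `askModel` is a hypothetical stand-in for a real Ollama client call (for example, `ollama.chat` from the `ollama` npm package), injected so the fan-out logic stays self-contained.

```typescript
// Hypothetical signature for querying one model with one prompt.
type AskFn = (model: string, prompt: string) => Promise<string>;

// Send the same prompt to several models at once and collect every reply,
// keyed by model name. Promise.all runs the queries concurrently rather
// than one after another.
async function askAllModels(
  models: string[],
  prompt: string,
  askModel: AskFn,
): Promise<Record<string, string>> {
  const replies = await Promise.all(
    models.map(async (model) => [model, await askModel(model, prompt)] as const),
  );
  return Object.fromEntries(replies);
}
```

With a real client plugged in as `askModel`, calling `askAllModels(["llama2", "mistral", "gemma"], prompt, askModel)` yields all three responses as soon as the slowest model finishes.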
Stars: 20
Forks: 1
Language: TypeScript
License: —
Category: —
Last pushed: Apr 14, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ahmetkca/PolyOllama"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
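The same endpoint can be called from TypeScript with the built-in fetch API. The URL scheme comes from the curl example above; the shape of the JSON response is an assumption, so the result is left untyped here.

```typescript
// Base URL taken from the curl example above.
const API_BASE = "https://pt-edge.onrender.com/api/v1/quality";

// Build the endpoint URL for a repo slug like "ahmetkca/PolyOllama".
function qualityUrl(repo: string): string {
  return `${API_BASE}/llm-tools/${repo}`;
}

// Fetch the quality data. The response is assumed to be JSON; its exact
// fields are not documented here, so the return type is left as unknown.
async function fetchQuality(repo: string): Promise<unknown> {
  const res = await fetch(qualityUrl(repo));
  if (!res.ok) throw new Error(`API request failed: ${res.status}`);
  return res.json();
}
```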
Higher-rated alternatives
SillyTavern/SillyTavern: LLM Frontend for Power Users.
libre-webui/libre-webui: Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui: A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab: Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™.
ollama4j/ollama4j: A simple Java library for interacting with Ollama server.