aetaix/ollami

Ollami is a frontend for Ollama, allowing users to quickly chat with their local models.

Quality score: 47/100 (Emerging)

This tool helps you quickly experiment with different large language models (LLMs) right on your own computer, without needing cloud services. You type in your questions or prompts, and the application provides text responses, allowing you to compare how various models like Llama 3 or Mistral perform. It's designed for anyone curious about running LLMs locally, from researchers to content creators, who wants to test model capabilities firsthand.

Use this if you want to interact with and evaluate various large language models directly on your machine for text generation and reasoning.

Not ideal if you require cloud-based model hosting or need to integrate LLM capabilities into a custom application via an API.

Tags: AI experimentation, local LLM, prompt engineering, model evaluation, text generation
No Package · No Dependents
Maintenance 6 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 17 / 25


Stars: 68
Forks: 12
Language: Svelte
License: MIT
Last pushed: Dec 03, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/aetaix/ollami"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
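For scripted access, the same endpoint can be called from Python. This is a minimal sketch using only the standard library; the `quality_url` and `fetch_quality` helper names are illustrative, and the response is assumed to be JSON.

```python
import json
from urllib.request import Request, urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(repo: str) -> str:
    # Build the endpoint URL for an "owner/name" repo slug,
    # matching the curl example above.
    return f"{BASE}/llm-tools/{repo}"

def fetch_quality(repo: str, timeout: float = 10.0) -> dict:
    # Anonymous access is rate-limited to 100 requests/day;
    # a free key raises that to 1,000 requests/day.
    req = Request(quality_url(repo), headers={"Accept": "application/json"})
    with urlopen(req, timeout=timeout) as resp:
        return json.load(resp)
```

Calling `fetch_quality("aetaix/ollami")` would then return the same data shown on this page as a Python dict.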