KolosalAI/kolosal-server

Kolosal AI is an open-source, lightweight alternative to Ollama for running LLMs 100% offline on your device.

Quality score: 42 / 100 (Emerging)

This project helps you run open-weight large language models (LLMs) entirely on your own Windows or Linux computer, without needing an internet connection. You feed it a model file, and it exposes a local service that can answer your questions or generate text. It's designed for individuals, small businesses, and researchers who want to use advanced AI models privately and without external API costs.

Use this if you need to run large language models offline, keep your data private, or customize your AI applications without relying on cloud services.

Not ideal if you prefer simple, plug-and-play web applications for AI without any local setup or technical configuration.
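The local service described above is typically queried over HTTP. A minimal Python sketch of a client request, assuming an OpenAI-compatible `/v1/chat/completions` endpoint on port 8080 — a common convention for local LLM servers, but an assumption not confirmed by this page:

```python
import json
import urllib.request

# Hypothetical endpoint: many offline LLM servers mimic the OpenAI
# chat API, but kolosal-server's exact path and port are assumptions.
ENDPOINT = "http://localhost:8080/v1/chat/completions"

def build_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Build a POST request carrying an OpenAI-style chat payload."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send it (requires a running local server):
#   with urllib.request.urlopen(build_request("Hello!")) as resp:
#       print(json.load(resp))
```

Because everything runs on localhost, no prompt text ever leaves the machine, which is the privacy property the project advertises.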

AI-privacy offline-AI local-language-models data-security custom-AI-development
No package · No dependents
Maintenance 6 / 25
Adoption 5 / 25
Maturity 16 / 25
Community 15 / 25


Stars: 13
Forks: 5
Language: C++
License: Apache-2.0
Last pushed: Jan 02, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/KolosalAI/kolosal-server"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.