mudler/LocalAI
:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more. Features: Generate Text, MCP, Audio, Video, Images, Voice Cloning, Distributed, P2P and decentralized inference
LocalAI helps you run advanced AI models for tasks like generating text, images, or audio, right on your own computer or server. You provide the raw input (text, audio, images), and it delivers generated content, summaries, or analyses. This is for organizations and individuals who need to use large language models and other AI capabilities without sending their data to external cloud services.
43,530 stars. Actively maintained with 221 commits in the last 30 days.
Use this if you need to run AI models locally for privacy, cost control, or custom integrations, and want a flexible system that works on various hardware.
Not ideal if you prefer a fully managed cloud service for AI model inference without any local setup or hardware considerations.
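Because LocalAI advertises itself as a drop-in replacement for the OpenAI API, any OpenAI-style client can point at a local instance instead of the cloud. A minimal sketch using only the standard library; the host, port, and model name are assumptions and should be adjusted to match your deployment:

```python
import json
import urllib.request

# LocalAI serves OpenAI-compatible endpoints. The base URL and the
# model name below are assumptions -- adjust them to your deployment.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt, model="gpt-4"):
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt, model="gpt-4"):
    """POST a chat completion request to LocalAI and return the reply text."""
    req = urllib.request.Request(
        BASE_URL + "/chat/completions",
        data=json.dumps(build_chat_request(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Usage (requires a running LocalAI server with a model configured):
#   reply = chat("Summarize this paragraph: ...")
```

The payload shape mirrors the OpenAI chat completions format, which is why existing client code typically only needs its base URL changed.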
Stars: 43,530
Forks: 3,679
Language: Go
License: MIT
Category:
Last pushed: Mar 13, 2026
Commits (30d): 221
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/mudler/LocalAI"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
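The curl command above can also be issued programmatically. A small sketch that builds the same endpoint URL and fetches it with the standard library; the response schema and the authorization header name for a key are assumptions, so check the service's documentation before relying on them:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem, owner, repo):
    # Mirrors the curl example: /quality/<ecosystem>/<owner>/<repo>
    return f"{API_BASE}/{ecosystem}/{owner}/{repo}"

def fetch_quality(ecosystem, owner, repo, api_key=None):
    """Fetch repo quality data. The JSON schema of the response and the
    Bearer-token header for keyed access are assumptions, not documented
    facts from this page."""
    req = urllib.request.Request(quality_url(ecosystem, owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")  # assumed header
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage:
#   data = fetch_quality("transformers", "mudler", "LocalAI")
```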
Related models
ludwig-ai/ludwig
Low-code framework for building custom LLMs, neural networks, and other AI models
withcatai/node-llama-cpp
Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema...
zhudotexe/kani
kani (カニ) is a highly hackable microframework for tool-calling language models. (NLP-OSS @ EMNLP 2023)
SciSharp/LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.
kaito-project/aikit
🏗️ Fine-tune, build, and deploy open-source LLMs easily!