Mobile-Artificial-Intelligence/llama_sdk
lcpp is a Dart implementation of llama.cpp used by the Mobile Artificial Intelligence Distribution (Maid).
It lets developers integrate large language models directly into their mobile and desktop applications: it takes model files and user prompts as input and outputs text generated by the model. It is aimed at software developers building applications that require on-device AI capabilities.
Use this if you are a Dart/Flutter developer who needs to embed a large language model directly into your application for tasks like chatbots, content generation, or data analysis without relying on external APIs.
Not ideal if you are not a software developer or if you need a pre-built application rather than a programming library.
Stars: 115
Forks: 26
Language: C++
License: MIT
Category:
Last pushed: Mar 13, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Mobile-Artificial-Intelligence/llama_sdk"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related models
ludwig-ai/ludwig
Low-code framework for building custom LLMs, neural networks, and other AI models
withcatai/node-llama-cpp
Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema...
mudler/LocalAI
🤖 The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and...
zhudotexe/kani
kani (カニ) is a highly hackable microframework for tool-calling language models. (NLP-OSS @ EMNLP 2023)
SciSharp/LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.