Aatricks/llmedge
Android native AI inference library bringing GGUF models and Stable Diffusion inference to Android devices, powered by llama.cpp and stable-diffusion.cpp.
This is an Android library that lets app developers build AI features directly into their mobile apps. Apps can take text prompts or images as input and produce AI-generated text, speech, images, or even short videos, all running on the user's device without an internet connection. Mobile developers, particularly those building apps for content creation, productivity, or accessibility, would use it to add advanced AI capabilities.
Use this if you are an Android developer looking to integrate on-device large language models, speech-to-text, text-to-speech, image generation, or video generation directly into your mobile application.
Not ideal if you need a pre-built end-user application, or if you are not an Android developer building custom mobile AI experiences.
Stars: 37
Forks: 4
Language: Kotlin
License: Apache-2.0
Category:
Last pushed: Mar 11, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/Aatricks/llmedge"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
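The curl call above can also be made programmatically. A minimal Python sketch, assuming (not confirmed by this page) that the path pattern generalizes to any owner/repo pair and that the endpoint returns JSON:

```python
# Sketch of calling the quality API from code instead of curl.
# Assumptions: the /rag/<owner>/<repo> path generalizes beyond this
# repository, and the response body is JSON.
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a GitHub-style owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the quality record for a repository."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    # Reproduces the exact URL shown on this page.
    print(quality_url("Aatricks", "llmedge"))
```

Unauthenticated callers get 100 requests/day, so batch lookups should be throttled accordingly.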
Higher-rated alternatives
langbot-app/LangBot
Production-grade platform for building agentic IM bots (production-grade multi-platform intelligent bot development platform, providing Agents, knowledge-base orchestration, a plugin system /...)
open-webui/open-webui
User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
cactus-compute/cactus
Low-latency AI engine for mobile devices & wearables
sigoden/aichat
All-in-one LLM CLI tool featuring Shell Assistant, Chat-REPL, RAG, AI Tools & Agents, with...
rudrankriyam/Foundation-Models-Framework-Example
Example apps for Foundation Models Framework in iOS 26 and macOS 26