mbzuai-oryx/MobiLlama

[ICLR-2025-SLLM Spotlight 🔥] MobiLlama: Small Language Model tailored for edge devices

Overall score: 44 / 100 (Emerging)

MobiLlama brings conversational AI directly onto compact devices such as smartphones, smart home gadgets, and small embedded systems. It takes text prompts and generates human-like text responses, letting developers build AI features that run entirely on-device. It is aimed at engineers and product managers building applications where privacy, offline capability, or low power consumption is critical.

670 stars. No commits in the last 6 months.

Use this if you need to deploy a capable language model directly onto a resource-constrained device, gaining offline functionality and stronger data privacy.

Not ideal if your task demands the largest, most capable language models and can leverage massive cloud computing resources; MobiLlama trades raw capability for efficiency on small devices.

edge-computing on-device-ai embedded-systems natural-language-processing privacy-preserving-ai
Stale (6 months) · No package · No dependents
Maintenance 2 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 16 / 25


Stars: 670
Forks: 52
Language: Python
License: Apache-2.0
Last pushed: May 10, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/mbzuai-oryx/MobiLlama"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
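The same request can be made from Python. This is a minimal sketch using only the standard library: the endpoint URL comes from the curl example above, but the JSON response schema is not documented here, so the fetched payload is returned as-is rather than assuming particular field names.

```python
# Sketch: query the pt-edge quality API from Python (stdlib only).
# Only the endpoint URL is taken from the listing; the response
# schema is an assumption and is returned unparsed-into-fields.
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"


def build_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a GitHub owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body."""
    with urllib.request.urlopen(build_url(owner, repo)) as resp:
        return json.load(resp)
```

Usage: `fetch_quality("mbzuai-oryx", "MobiLlama")` performs the same call as the curl command above and returns the decoded JSON as a dict. Unauthenticated calls count against the 100-requests/day limit.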