alibaba/MNN

MNN: A blazing-fast, lightweight inference engine battle-tested by Alibaba, powering high-performance on-device LLMs and Edge AI.

Quality score: 80 / 100 (Verified)

This project helps developers integrate advanced AI capabilities, like large language models and image generation, directly into applications running on mobile phones, PCs, or IoT devices. It takes pre-trained AI models as input and delivers optimized, high-performance inference outputs, enabling features like offline AI chatbots or on-device image editing. This is for software engineers and product developers building AI-powered applications for edge devices.

14,526 stars. Actively maintained with 52 commits in the last 30 days. Available on PyPI.

Use this if you are a developer looking to embed performant AI features, such as conversational AI or creative image tools, directly into your mobile, desktop, or IoT applications without relying on cloud services.

Not ideal if you are an end-user simply looking for an AI application to use, rather than a developer building one.

edge-ai-development mobile-app-ai iot-ai on-device-machine-learning ai-model-deployment
Maintenance: 22 / 25
Adoption: 10 / 25
Maturity: 25 / 25
Community: 23 / 25
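The overall score appears to be the sum of the four subscores above, each out of 25. A minimal sanity-check sketch (the summing rule is inferred from the numbers shown, not documented by the site):

```python
# Subscores as displayed on the card; each category is scored out of 25.
subscores = {
    "Maintenance": 22,
    "Adoption": 10,
    "Maturity": 25,
    "Community": 23,
}

# Overall score: sum of the four categories, out of 4 x 25 = 100.
total = sum(subscores.values())
print(f"{total} / {4 * 25}")  # → 80 / 100
```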


Stars: 14,526
Forks: 2,234
Language: C++
License: Apache-2.0
Last pushed: Mar 13, 2026
Commits (30d): 52
Dependencies: 1

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/alibaba/MNN"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
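The same endpoint can be called from a script. A minimal Python sketch: the base URL comes from the curl command above, but the JSON response fields (e.g. a `score` key) are assumptions, not documented here, so the request itself is left commented out.

```python
import json
import urllib.request

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

url = quality_url("alibaba", "MNN")
print(url)

# Uncomment to perform the request (100/day without a key):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     print(data.get("score"))  # "score" field name is hypothetical
```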