XiaoMi/mace
MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms.
MACE helps mobile application developers integrate deep learning capabilities into their apps. It takes existing AI models from frameworks such as TensorFlow or Caffe and optimizes them to run efficiently on mobile devices. The result is faster, more power-efficient, and more responsive AI features in mobile applications on smartphones and other portable devices.
5,033 stars. No commits in the last 6 months.
Use this if you are a mobile app developer building applications that need to run AI models directly on a user's phone or other edge devices, requiring high performance and low power consumption.
Not ideal if you are developing AI models for server-side processing, cloud-based inference, or for desktop applications where mobile-specific optimizations are not relevant.
Stars: 5,033
Forks: 823
Language: C++
License: Apache-2.0
Category: ml-frameworks
Last pushed: Jun 17, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/XiaoMi/mace"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
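The curl call above can also be wrapped in a few lines of Python. The URL pattern is taken directly from the example; the fetch helper and its timeout are a sketch, and the JSON response schema is not documented here, so the result is returned as an untyped dict.

```python
import json
import urllib.request

# Base path taken from the curl example on this page.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality endpoint URL for a repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch quality data for a repo. Response schema is an assumption;
    the API may return fields not handled here."""
    with urllib.request.urlopen(quality_url(category, owner, repo), timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

For example, `fetch_quality("ml-frameworks", "XiaoMi", "mace")` hits the same endpoint as the curl command; within the anonymous 100 requests/day limit no credentials are needed.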
Higher-rated alternatives
iree-org/iree - A retargetable MLIR-based machine learning compiler and runtime toolkit.
brucefan1983/GPUMD - Graphics Processing Units Molecular Dynamics
uxlfoundation/oneDAL - oneAPI Data Analytics Library (oneDAL)
rapidsai/cuml - cuML - RAPIDS Machine Learning Library
NVIDIA/cutlass - CUDA Templates and Python DSLs for High-Performance Linear Algebra