ncnn and armnn

These two inference frameworks are competitors: both provide optimized neural network execution on resource-constrained devices, with ncnn targeting mobile platforms broadly and Arm NN targeting Arm-based hardware specifically.

                  ncnn               armnn
Score             71 (Verified)      61 (Established)
Maintenance       20/25              10/25
Adoption          10/25              10/25
Maturity          16/25              16/25
Community         25/25              25/25
Stars             22,890             1,299
Forks             4,405              328
Downloads         —                  —
Commits (30d)     26                 0
Language          C++                C++
License           —                  MIT
Package           none               none
Dependents        none               none

About ncnn

Tencent/ncnn

ncnn is a high-performance neural network inference framework optimized for mobile platforms.

This framework helps mobile app developers integrate high-performance artificial intelligence features into their applications. It takes pre-trained deep learning models and optimizes them to run efficiently on mobile phone CPUs, enabling intelligent functionality directly on users' devices. Its primary audience is mobile application developers.

mobile-app-development deep-learning-deployment mobile-ai embedded-ai performance-optimization
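The workflow described above follows ncnn's documented pattern: load a model converted to ncnn's .param/.bin format, then run inference through an extractor. A minimal sketch, assuming the ncnn SDK is installed; the file names and the blob names "in0"/"out0" are placeholders that depend on the converted model:

```cpp
// Minimal ncnn CPU inference sketch (illustrative; paths and blob
// names are hypothetical and model-dependent).
#include "net.h"

int main()
{
    ncnn::Net net;
    net.opt.use_vulkan_compute = false; // stay on the CPU path

    // Load a model previously converted to ncnn format
    net.load_param("model.param");
    net.load_model("model.bin");

    // Dummy 224x224x3 input filled with a constant value
    ncnn::Mat in(224, 224, 3);
    in.fill(0.5f);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("in0", in);      // input blob name from the .param file
    ncnn::Mat out;
    ex.extract("out0", out);  // output blob name from the .param file
    return 0;
}
```

Conversion tools shipped with ncnn (e.g. for ONNX models) produce the .param/.bin pair consumed here.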

About armnn

ARM-software/armnn

Arm NN ML Software.

This software helps embedded systems engineers and machine learning practitioners deploy machine learning models efficiently on devices powered by Arm processors, such as smartphones and IoT devices. It takes pre-trained TensorFlow Lite models and optimizes them for accelerated inference on Arm Cortex-A CPUs and Arm Mali GPUs. It is intended for developers building applications that perform ML tasks directly on Arm-based hardware.

embedded-systems mobile-ml on-device-inference edge-ai hardware-acceleration
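The deployment flow described above (parse a TFLite model, optimize it for a backend, load it into the runtime) can be sketched with Arm NN's C++ API. A minimal sketch, assuming Arm NN and its TfLite parser are installed; the model path is a placeholder:

```cpp
// Minimal Arm NN sketch: parse a TFLite model, optimize it for the
// CpuAcc backend, and load it into the runtime (illustrative only).
#include <armnn/ArmNN.hpp>
#include <armnnTfLiteParser/ITfLiteParser.hpp>

int main()
{
    using namespace armnn;

    // Parse the pre-trained TensorFlow Lite model (path is hypothetical)
    auto parser = armnnTfLiteParser::ITfLiteParser::Create();
    INetworkPtr network = parser->CreateNetworkFromBinaryFile("model.tflite");

    // Create the runtime and optimize for the Cortex-A CPU backend;
    // Compute::GpuAcc would target a Mali GPU instead.
    IRuntimePtr runtime = IRuntime::Create(IRuntime::CreationOptions());
    IOptimizedNetworkPtr optNet = Optimize(*network,
                                           {Compute::CpuAcc},
                                           runtime->GetDeviceSpec());

    NetworkId netId;
    runtime->LoadNetwork(netId, std::move(optNet));
    // Inference then runs via runtime->EnqueueWorkload(netId, ...)
    return 0;
}
```

Choosing the backend list at Optimize() time is how Arm NN dispatches the same network to CPU or GPU acceleration.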

Scores are updated daily from GitHub, PyPI, and npm data.