hailo-ai/hailort

An open-source, lightweight, high-performance inference framework for Hailo devices

Score: 51 / 100 (Established)

HailoRT is a runtime library that lets developers load trained deep learning models and run them efficiently on Hailo AI accelerator hardware. Its primary users are embedded systems engineers and AI solution developers building products around Hailo's specialized AI chips.


Use this if you are developing an embedded application that needs to perform high-performance AI inference directly on Hailo-10 or Hailo-15 AI accelerator devices.

Not ideal if you are looking for a general-purpose AI development framework that runs on standard CPUs or GPUs, or if your hardware is not a Hailo AI accelerator.

embedded-systems-development AI-inference edge-AI hardware-integration deep-learning-deployment
No license · No package · No dependents
Maintenance 10 / 25
Adoption 10 / 25
Maturity 8 / 25
Community 23 / 25
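The four sub-scores above add up to the overall score (10 + 10 + 8 + 23 = 51). A minimal sketch, assuming the overall score is a plain sum of the four per-category scores, each out of 25 (the site's actual formula is not documented here):

```python
# Hedged sketch: assumes the overall quality score is the plain sum
# of the four per-category sub-scores, each capped at 25 points.
SUBSCORES = {
    "Maintenance": 10,
    "Adoption": 10,
    "Maturity": 8,
    "Community": 23,
}

def overall(subscores: dict) -> int:
    # Each category contributes at most 25 points, so the total is out of 100.
    assert all(0 <= v <= 25 for v in subscores.values())
    return sum(subscores.values())

print(overall(SUBSCORES))  # 51, matching the 51 / 100 shown above
```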


Stars: 172
Forks: 68
Language: C++
License: none
Last pushed: Feb 03, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/hailo-ai/hailort"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
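The curl call above can be wrapped in a few lines of Python. This is a sketch that assumes the URL pattern generalizes to `/api/v1/quality/<category>/<owner>/<repo>` (the `ml-frameworks` category slug comes from the example; the `quality_url` and `fetch_quality` helper names and the JSON response format are assumptions, not a documented client):

```python
import json
import urllib.request

# Assumed base URL, taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Assumed URL pattern, generalized from the documented example.
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    # 100 requests/day without a key; the response is assumed to be JSON.
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

print(quality_url("ml-frameworks", "hailo-ai", "hailort"))
```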