hollance/YOLO-CoreML-MPSNNGraph

Tiny YOLO for iOS implemented using CoreML but also using the new MPS graph API.

Quality score: 51 / 100 (Established)

This project helps iOS developers add real-time object detection to their apps: it takes an image as input and returns the identified objects with bounding boxes around them. It is aimed at developers building iOS applications that need to visually identify and locate multiple objects in images or live camera feeds.

945 stars. No commits in the last 6 months.

Use this if you are an iOS developer looking for a robust way to implement object detection in your apps, especially if you want to compare different performance options (Core ML versus the MPS graph API) on iOS 11, before iOS 12's Vision framework added support for YOLO-like models.

Not ideal if you are developing for iOS 12 or newer and prefer using the simpler Vision framework for object detection, or if you are not an iOS developer.
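For context, the iOS 12+ Vision alternative mentioned above looks roughly like this. This is a minimal, untested sketch, not code from this repo: `TinyYOLO` is a placeholder for whatever Xcode-generated class your compiled `.mlmodel` produces, and it assumes the model ships with a non-maximum-suppression output so Vision returns recognized-object observations directly.

```swift
import Vision
import CoreML
import CoreGraphics

// Minimal sketch of object detection via Vision on iOS 12+.
// `TinyYOLO` is a hypothetical class name; Xcode generates one
// for whatever .mlmodel file you add to the project.
func detectObjects(in image: CGImage) throws {
    let coreMLModel = try TinyYOLO(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    let request = VNCoreMLRequest(model: visionModel) { request, error in
        // Models whose pipeline includes non-maximum suppression yield
        // VNRecognizedObjectObservation, which carries candidate labels
        // plus a normalized (0...1) bounding box.
        guard let observations = request.results as? [VNRecognizedObjectObservation] else { return }
        for object in observations {
            let best = object.labels.first
            print("\(best?.identifier ?? "?") @ \(object.boundingBox) " +
                  "(confidence \(best?.confidence ?? 0))")
        }
    }
    request.imageCropAndScaleOption = .scaleFill

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
}
```

With this route, Vision handles image resizing and output decoding for you; the manual decoding of YOLO's grid output that this repo demonstrates is only needed on iOS 11, or when you want full control over the pipeline.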

iOS-development mobile-app-development computer-vision real-time-object-detection
Status: Stale (6 months), No Package, No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 25 / 25


Stars: 945
Forks: 254
Language: Swift
License: MIT
Last pushed: Nov 19, 2019
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/hollance/YOLO-CoreML-MPSNNGraph"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.