hollance/YOLO-CoreML-MPSNNGraph
Tiny YOLO for iOS, implemented both with Core ML and with the MPSNNGraph API from Metal Performance Shaders.
This project helps iOS developers integrate real-time object detection into their apps: it takes an image as input and outputs the identified objects with bounding boxes around them. It suits applications that need to visually identify and locate multiple objects in still images or live camera feeds.
945 stars. No commits in the last 6 months.
Use this if you are an iOS developer looking for a robust way to implement object detection, especially if you want to compare performance options on iOS 11, before iOS 12's Vision framework added support for YOLO-style models.
Not ideal if you target iOS 12 or newer and prefer the simpler Vision framework for object detection, or if you are not developing for iOS.
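For context, on iOS 12 and later a Core ML detection model can be driven through the Vision framework instead of a hand-rolled pipeline. A minimal sketch, assuming a compiled model whose Xcode-generated class is named `TinyYOLO` (that class name is a placeholder, not part of this repository):

```swift
import CoreML
import Vision

// Hypothetical sketch: wrap a Core ML object-detection model in a Vision request.
// `TinyYOLO` stands in for whatever class Xcode generates from your .mlmodel file.
func makeDetectionRequest() throws -> VNCoreMLRequest {
    let model = try VNCoreMLModel(for: TinyYOLO(configuration: MLModelConfiguration()).model)
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNRecognizedObjectObservation] else { return }
        for observation in results {
            // boundingBox is normalized (0...1), origin at the bottom-left.
            let label = observation.labels.first?.identifier ?? "unknown"
            print(label, observation.boundingBox)
        }
    }
    // Controls how the input image is fitted to the model's expected input size.
    request.imageCropAndScaleOption = .scaleFill
    return request
}

// Usage (hypothetical), given a CGImage named `image`:
// let handler = VNImageRequestHandler(cgImage: image)
// try handler.perform([makeDetectionRequest()])
```

With this approach Vision handles image scaling and decoding of `VNRecognizedObjectObservation` results, which is the "simpler" path mentioned above compared with the manual Core ML and MPSNNGraph pipelines this repository demonstrates.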
Stars: 945
Forks: 254
Language: Swift
License: MIT
Category:
Last pushed: Nov 19, 2019
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/hollance/YOLO-CoreML-MPSNNGraph"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Related frameworks
apple/coremltools
Core ML tools contain supporting tools for Core ML model conversion, editing, and validation.
tensorflow/swift-apis
Swift for TensorFlow Deep Learning Library
hollance/Forge
A neural network toolkit for Metal
hanleyweng/CoreML-in-ARKit
Simple project to detect objects and display 3D labels above them in AR. This serves as a basic...
hollance/CoreMLHelpers
Types and functions that make it a little easier to work with Core ML in Swift.