yolo-flutter-app and yolo-ios-app
These repositories are ecosystem siblings: both are official Ultralytics reference implementations for deploying the same YOLO models on mobile, sharing the same underlying inference approach but targeting different development frameworks (Flutter and native iOS, respectively).
About yolo-flutter-app
ultralytics/yolo-flutter-app
Flutter plugin for Ultralytics YOLO
This is a Flutter plugin that lets mobile app developers add advanced computer vision features such as object detection, image classification, and pose estimation directly to their iOS and Android applications. Developers feed camera frames or still images into the plugin and receive object labels, bounding boxes, or classification results in real time, enabling apps that can 'see' and understand their surroundings.
About yolo-ios-app
ultralytics/yolo-ios-app
Ultralytics YOLO iOS App source code for running YOLO in your own iOS apps 🌟
This project provides an iOS application and a Swift library for integrating real-time object detection into iPhone and iPad apps. It takes live camera feeds or still images as input and outputs detected objects, their locations, and even their poses. It is aimed at iOS developers who want to run advanced computer vision tasks, such as detecting items in photos or segmenting objects in video streams, directly on Apple devices.
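As a rough illustration of what on-device detection like this looks like in Swift, here is a minimal sketch using Apple's standard Vision and Core ML APIs (not this repository's own wrapper types, which may differ); the model name "yolo11n" is a placeholder for whatever Core ML-exported model you actually bundle:

```swift
import Foundation
import CoreML
import Vision
import CoreGraphics

/// Runs a bundled, Core ML-exported detection model on a single image and
/// prints each detection's top label, confidence, and normalized bounding box.
/// "yolo11n" is a hypothetical resource name for your bundled .mlmodelc.
func detectObjects(in image: CGImage) throws {
    guard let modelURL = Bundle.main.url(forResource: "yolo11n",
                                         withExtension: "mlmodelc") else {
        return // model not bundled
    }
    let visionModel = try VNCoreMLModel(for: MLModel(contentsOf: modelURL))

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Detection models surface results as VNRecognizedObjectObservation.
        for case let obs as VNRecognizedObjectObservation in request.results ?? [] {
            let top = obs.labels.first // highest-confidence class label
            print(top?.identifier ?? "unknown",
                  top?.confidence ?? 0,
                  obs.boundingBox) // normalized rect, origin at bottom-left
        }
    }
    request.imageCropAndScaleOption = .scaleFill // how the image is fit to the model input

    try VNImageRequestHandler(cgImage: image).perform([request])
}
```

For live video, the usual pattern is to capture frames with an AVCaptureVideoDataOutput delegate and run the same request through a VNImageRequestHandler per frame.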