Cartucho/mAP
mean Average Precision - This code evaluates the performance of your neural net for object recognition.
You provide ground-truth files listing the actual objects and their bounding boxes in each image, plus detection files with your model's predicted objects and confidence scores; the tool computes the mean Average Precision (mAP), the standard metric for object-detection accuracy. Aimed at engineers, researchers, and data scientists working on computer vision tasks.
2,980 stars. No commits in the last 6 months.
Use this if you need to quantify the performance of your object detection neural network using a standard, widely accepted metric like mAP.
Not ideal if you are looking for an object detection model itself, or need to evaluate image classification rather than object localization.
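The evaluation sketched above boils down to matching detections to ground-truth boxes by IoU, sweeping a confidence-ordered precision/recall curve, and taking the area under it (mAP is then the mean of this per-class AP). A minimal illustrative sketch, not this repo's actual code: function names are hypothetical, and it computes the raw area under the precision-recall curve without the interpolation step the PASCAL VOC protocol adds.

```python
def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def average_precision(detections, ground_truths, iou_thresh=0.5):
    """detections: list of (confidence, box); ground_truths: list of boxes.

    Greedy VOC-style matching: each detection is assigned to its highest-IoU
    ground-truth box; duplicates on an already-matched box count as false
    positives.
    """
    detections = sorted(detections, key=lambda d: -d[0])  # high confidence first
    matched = [False] * len(ground_truths)
    tp, fp = [], []
    for conf, box in detections:
        best, best_iou = -1, 0.0
        for i, gt in enumerate(ground_truths):
            o = iou(box, gt)
            if o > best_iou:
                best, best_iou = i, o
        if best >= 0 and best_iou >= iou_thresh and not matched[best]:
            matched[best] = True
            tp.append(1); fp.append(0)
        else:
            tp.append(0); fp.append(1)
    # accumulate precision/recall and sum rectangles under the curve
    ap, cum_tp, cum_fp, prev_recall = 0.0, 0, 0, 0.0
    for t, f in zip(tp, fp):
        cum_tp += t; cum_fp += f
        recall = cum_tp / len(ground_truths)
        precision = cum_tp / (cum_tp + cum_fp)
        ap += (recall - prev_recall) * precision
        prev_recall = recall
    return ap
```

With one perfect match, one stray detection, and one lower-confidence match over two ground-truth boxes, this yields AP = 5/6, showing how the false positive in the middle of the ranking drags down the score.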
Stars: 2,980
Forks: 921
Language: Python
License: Apache-2.0
Category:
Last pushed: Aug 15, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Cartucho/mAP"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
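The same endpoint can be called from Python; only the URL comes from this page, and the response schema is not documented here, so treat any field names as unknown until you inspect the returned JSON.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record for owner/repo (free tier: 100 requests/day)."""
    url = f"{API_BASE}/{owner}/{repo}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Example (performs a live request):
# data = fetch_quality("Cartucho", "mAP")
# print(json.dumps(data, indent=2))
```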
Related frameworks
ultralytics/yolov5
YOLOv5 🚀 in PyTorch > ONNX > CoreML > TFLite
ultralytics/yolov3
YOLOv3 in PyTorch > ONNX > CoreML > TFLite
mindspore-lab/mindyolo
A toolbox of yolo models and algorithms based on MindSpore
ultralytics/assets
Ultralytics assets
stephanecharette/DarkHelp
C++ wrapper library for Darknet