Cartucho/mAP

mean Average Precision - This code evaluates the performance of your neural net for object detection.

Overall score: 51 / 100 (Established)

This tool helps evaluate how well your object recognition system performs. You provide files detailing the actual objects and their locations in images (ground truth) and separate files with your system's detected objects and their confidence scores. It calculates a 'mean Average Precision' (mAP) score, indicating the accuracy and reliability of your object detections. This is for engineers, researchers, or data scientists working on computer vision tasks.
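At the core of the mAP score is the per-class Average Precision. As a rough sketch of the idea (not the repository's own code), AP can be computed by ranking detections by confidence, accumulating true/false positives, and integrating precision over recall with all-point interpolation (PASCAL VOC 2010+ style; the matching of detections to ground truth at an IoU threshold is assumed to happen beforehand):

```python
def average_precision(scored_matches, num_gt):
    """AP for one class, all-point interpolation (illustrative sketch).

    scored_matches: list of (confidence, is_tp) pairs, one per detection;
    is_tp marks whether the detection matched a ground-truth box.
    num_gt: total ground-truth boxes for this class.
    """
    # Rank detections by confidence, highest first.
    ranked = sorted(scored_matches, key=lambda m: m[0], reverse=True)
    tp = fp = 0
    precisions, recalls = [], []
    for _, is_tp in ranked:
        tp += is_tp
        fp += not is_tp
        precisions.append(tp / (tp + fp))
        recalls.append(tp / num_gt)
    # Make the precision curve monotonically decreasing from the right.
    for i in range(len(precisions) - 2, -1, -1):
        precisions[i] = max(precisions[i], precisions[i + 1])
    # Integrate precision over each step in recall.
    ap, prev_recall = 0.0, 0.0
    for p, r in zip(precisions, recalls):
        ap += p * (r - prev_recall)
        prev_recall = r
    return ap
```

The mAP is then the mean of these per-class AP values.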

2,980 stars. No commits in the last 6 months.

Use this if you need to quantify the performance of your object detection neural network using a standard, widely accepted metric like mAP.

Not ideal if you are looking for an object detection model itself, or need to evaluate image classification rather than object localization.
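To give a feel for the expected inputs, the snippet below writes one ground-truth file and one detection file for a single image. The `input/ground-truth` and `input/detection-results` directory names and the line formats follow the layout the repository's README describes, but are hedged here as assumptions that may differ between versions:

```python
import os

# Assumed directory layout: one .txt file per image in each folder.
os.makedirs("input/ground-truth", exist_ok=True)
os.makedirs("input/detection-results", exist_ok=True)

# Ground truth lines: "<class> <left> <top> <right> <bottom>"
with open("input/ground-truth/image_1.txt", "w") as f:
    f.write("dog 10 20 110 210\n")
    f.write("cat 30 40 120 200\n")

# Detection lines add a confidence score after the class name:
# "<class> <confidence> <left> <top> <right> <bottom>"
with open("input/detection-results/image_1.txt", "w") as f:
    f.write("dog 0.88 12 22 108 205\n")
    f.write("cat 0.42 35 45 118 196\n")
```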

Tags: object-detection, computer-vision, model-evaluation, deep-learning, robotics

Badges: Stale (6 months), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 25 / 25


Stars: 2,980
Forks: 921
Language: Python
License: Apache-2.0
Last pushed: Aug 15, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Cartucho/mAP"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.