DetoxAI/detoxai

Python toolkit for debiasing neural networks in image classification tasks

Score: 31 / 100 (Emerging)

When developing or deploying AI models that work with images, you may find that your model makes unfair decisions based on sensitive attributes such as gender or race, even when you never intended it to. This project helps you fix such biased image classification models: you provide your existing model and data, and it returns a corrected version that makes fairer, more balanced predictions. It is aimed at AI practitioners, researchers, and developers who work with computer vision models and want to ensure those models are equitable.

No commits in the last 6 months.

Use this if you have a neural network that classifies images, and you're concerned it might be making biased decisions based on protected attributes present in your image data.

Not ideal if your problem isn't related to image classification, or if you're not working with neural networks.

AI-ethics image-classification model-fairness computer-vision bias-mitigation
Signals: Stale (6 months) · No package published · No dependents
Maintenance: 2 / 25
Adoption: 9 / 25
Maturity: 16 / 25
Community: 4 / 25

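The overall score appears to be the plain sum of the four category subscores, each out of 25. This reading is inferred from the numbers on this page, not from any documented scoring formula:

```python
# Category subscores as shown on this page, each out of a possible 25.
subscores = {"Maintenance": 2, "Adoption": 9, "Maturity": 16, "Community": 4}

# Assumption: the overall score is the sum of the four categories.
overall = sum(subscores.values())
print(f"{overall} / 100")  # matches the 31 / 100 shown at the top
```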

Stars: 81
Forks: 2
Language: Python
License: MIT
Last pushed: Oct 13, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/DetoxAI/detoxai"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
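For repeated lookups it may be handier to call the endpoint from Python. The URL pattern below is taken from the curl command above; the shape of the JSON response is not documented here, so the fetch itself is only sketched:

```python
import json
import urllib.request
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(collection: str, owner: str, repo: str) -> str:
    """Build the quality API URL for a repository, matching the curl example."""
    return f"{API_BASE}/{quote(collection)}/{quote(owner)}/{quote(repo)}"

url = quality_url("ml-frameworks", "DetoxAI", "detoxai")

# Uncomment to fetch; the response fields are an assumption, hence left unparsed:
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```

Mind the 100 requests/day limit when scripting against the keyless endpoint.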