woctezuma/stable-diffusion-safety-checker

Python package to apply the Safety Checker from Stable Diffusion.

Quality score: 40 / 100 (Emerging)

This tool helps content moderators and platform managers automatically review images generated by Stable Diffusion. It takes a collection of images as input and identifies those that might contain "bad concepts" such as inappropriate or unsafe content. The output is a list of images flagged for review, helping maintain platform safety and compliance.

Use this if you need to automatically detect potentially unsafe or inappropriate content in image datasets, especially those generated by AI models like Stable Diffusion.

Not ideal if you need to detect highly nuanced or context-specific unsafe content, as it relies on predefined 'bad concepts'.
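The review workflow described above — a collection of images in, a list of flagged images out — can be sketched in a few lines. Note that `is_flagged` here is a hypothetical stand-in predicate, not the package's actual API; a real checker would run each image through the Stable Diffusion safety model.

```python
from pathlib import Path


def is_flagged(image_path: Path) -> bool:
    """Stand-in safety predicate for illustration only.

    A real implementation would load the image and score it
    against the safety model's predefined "bad concepts".
    """
    # Hypothetical rule: flag files whose name suggests unsafe content.
    return "nsfw" in image_path.stem.lower()


def review_images(image_dir: Path) -> list[Path]:
    """Return the images in image_dir that are flagged for manual review."""
    return sorted(p for p in image_dir.glob("*.png") if is_flagged(p))
```

The key design point is that the checker is a pure predicate over images, so swapping the stand-in for the real model-backed check leaves the surrounding moderation pipeline unchanged.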

Tags: content-moderation, AI-safety, image-review, platform-management, brand-safety

No package published · No dependents

Maintenance: 6 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 13 / 25


Stars: 9 · Forks: 2 · Language: Python · License: MIT · Last pushed: Dec 22, 2025 · Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/woctezuma/stable-diffusion-safety-checker"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
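The same endpoint can be called from Python with only the standard library. The URL pattern is taken from the `curl` example above; the structure of the JSON response is an assumption, so inspect a real response before depending on specific field names.

```python
import json
import urllib.request

# Base path copied from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/diffusion"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-report URL for a GitHub repository."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the quality report.

    No API key is needed for up to 100 requests/day; the decoded
    JSON is returned as a plain dict (field names not guaranteed).
    """
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)
```

For example, `fetch_quality("woctezuma", "stable-diffusion-safety-checker")` requests the same report as the `curl` command shown above.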