GantMan/nsfw_model

Keras model of NSFW detector

Score: 48 / 100 (Emerging)

This tool helps content moderators, platform managers, and community managers automatically identify and categorize images that might be inappropriate or sensitive. You feed it a single image, a list of images, or an entire folder of images, and it classifies each image into one of five categories: drawings, hentai, neutral, porn, or sexy. This lets you quickly filter out potentially problematic visuals before they reach your audience.
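The filtering step described above can be sketched in Python. The five category names come from the model's outputs; the score values, the threshold, and the `should_flag` helper are illustrative assumptions, not part of the project:

```python
# Hypothetical post-processing sketch. The category names are the model's
# five output classes; the 0.8 threshold and example scores are assumptions.

FLAGGED = {"hentai", "porn", "sexy"}

def should_flag(predictions: dict, threshold: float = 0.8) -> bool:
    """Flag an image if any sensitive category scores above the threshold."""
    return any(score >= threshold
               for category, score in predictions.items()
               if category in FLAGGED)

# Example scores for a single image (illustrative only).
example = {"drawings": 0.02, "hentai": 0.01, "neutral": 0.92,
           "porn": 0.03, "sexy": 0.02}
print(should_flag(example))  # False: no sensitive category exceeds 0.8
```

In practice you would feed the model's per-image prediction dictionaries into a helper like this to decide which images need human review.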

2,047 stars. No commits in the last 6 months.

Use this if you need an automated way to screen large volumes of images for various categories of adult or sensitive content.

Not ideal if you need to detect nuanced or context-specific inappropriate content that goes beyond visual categorization (e.g., hate speech in text overlaid on an image).

content-moderation platform-safety image-filtering community-management digital-publishing
Flags: Stale (6 months) · No package · No dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 22 / 25


Stars: 2,047
Forks: 299
Language: Python
License:
Last pushed: Feb 26, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/GantMan/nsfw_model"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
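The curl call above can also be made with Python's standard library. A minimal sketch, assuming the endpoint returns JSON (the response schema is not shown here, so inspect the payload before relying on any fields):

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(collection: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{BASE}/{collection}/{owner}/{repo}"

def fetch_quality(collection: str, owner: str, repo: str) -> dict:
    # Assumption: the endpoint returns a JSON object; the actual field
    # names in the response are not documented in this card.
    with urlopen(quality_url(collection, owner, repo)) as resp:
        return json.load(resp)

print(quality_url("ml-frameworks", "GantMan", "nsfw_model"))
```

Note the anonymous rate limit of 100 requests/day; with a free key you get 1,000/day.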