TheHamkerCat/NSFW_Detection_API

REST API written in Python to classify NSFW images.

45 / 100 (Emerging)

This tool helps content moderators, platform administrators, and social media managers automatically identify and categorize images that might be inappropriate for their audience. You send it an image, and it reports whether the image contains adult content, such as pornography or hentai, or is safe for work. It's designed for anyone who needs to filter user-generated content or monitor visual media.
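To illustrate the workflow, here is a minimal sketch of calling such a service from Python. The endpoint path (`/classify`), request shape, and response fields below are assumptions for illustration, not taken from the repository; check the project's README for the actual API.

```python
# Hedged sketch: posting an image URL to a hypothetical local deployment
# of the service and interpreting its (assumed) JSON score response.
import json
from urllib import request

API_URL = "http://localhost:8000/classify"  # hypothetical endpoint, not from the repo


def is_safe(result: dict, threshold: float = 0.8) -> bool:
    """Treat the image as safe when the 'neutral' score dominates.

    The score keys here ('porn', 'hentai', 'neutral') are illustrative.
    """
    return result.get("neutral", 0.0) >= threshold


def classify(image_url: str) -> dict:
    """POST an image URL to the assumed /classify endpoint and parse JSON."""
    payload = json.dumps({"url": image_url}).encode()
    req = request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


# Interpreting an example response (scores are made up for illustration):
scores = {"porn": 0.02, "hentai": 0.01, "neutral": 0.95}
print(is_safe(scores))  # a 0.95 neutral score clears the 0.8 threshold
```

In practice you would wire `classify()` into your upload pipeline and reject or flag images whose non-neutral scores exceed your moderation threshold.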

No commits in the last 6 months.

Use this if you need to automatically detect and classify sexually explicit or inappropriate images to maintain content standards on a platform.

Not ideal if you need to classify images based on highly specific or nuanced content categories beyond general NSFW detection.

content-moderation platform-management image-filtering social-media-management digital-safety
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 21 / 25


Stars: 68
Forks: 33
Language: Python
License: MIT
Last pushed: Feb 11, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/TheHamkerCat/NSFW_Detection_API"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.