TheHamkerCat/NSFW_Detection_API
REST API written in Python to classify NSFW images.
This tool helps content moderators, platform administrators, and social media managers automatically identify and categorize images that may be inappropriate for their audience. You provide it with an image, and it tells you whether the image contains adult content such as pornography or hentai, or is safe for work. It's designed for anyone who needs to filter user-generated content or monitor visual media.
No commits in the last 6 months.
Use this if you need to automatically detect and classify sexually explicit or inappropriate images to maintain content standards on a platform.
Not ideal if you need to classify images based on highly specific or nuanced content categories beyond general NSFW detection.
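The workflow described above (send an image, get back per-category scores, decide safe vs. unsafe) can be sketched as a small client. This is a hedged illustration, not the repo's documented API: the `/classify` endpoint path, the request/response shape, and the label names (`porn`, `hentai`, `sexy`, `neutral`, `drawing`) are assumptions; check the repo's README for the actual contract.

```python
# Hypothetical client for a self-hosted NSFW detection API.
# Endpoint path, payload fields, and label names are ASSUMPTIONS
# for illustration -- consult the repo's README for the real API.
import json
import urllib.request

UNSAFE_LABELS = {"porn", "hentai", "sexy"}  # assumed label set


def is_safe(scores: dict, threshold: float = 0.7) -> bool:
    """Return True when no unsafe label's score reaches the threshold."""
    return all(scores.get(label, 0.0) < threshold for label in UNSAFE_LABELS)


def classify(image_url: str, api_base: str = "http://localhost:8000") -> dict:
    """POST an image URL to a hypothetical /classify endpoint."""
    payload = json.dumps({"url": image_url}).encode()
    req = urllib.request.Request(
        f"{api_base}/classify",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    scores = classify("https://example.com/image.jpg")
    print("safe" if is_safe(scores) else "flagged")
```

Keeping the thresholding in a pure function like `is_safe` makes the moderation policy (which labels count as unsafe, and at what confidence) easy to tune per platform without touching the network code.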
Stars: 68
Forks: 33
Language: Python
License: MIT
Category:
Last pushed: Feb 11, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/TheHamkerCat/NSFW_Detection_API"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
infinitered/nsfwjs
NSFW detection on the client-side via TensorFlow.js
alex000kim/nsfw_data_scraper
Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
mdietrichstein/tensorflow-open_nsfw
Tensorflow Implementation of Yahoo's Open NSFW Model
GantMan/nsfw_model
Keras model of NSFW detector
infinitered/nsfwjs-mobile
NSFWjs in React Native