SumitM0432/Explicit-Content-Classifier-using-ResNet
The objective of this project is to classify explicit content such as pornographic and Hentai images. The classifiers used are ResNet50 and ResNet101, variants of the Residual Neural Network (ResNet) architecture. The model is trained on five categories: Porn, Hentai, Sexy, Drawing, and Neutral. Porn, Hentai, and Sexy are further grouped as NSFW (Not Safe For Work), while the other two are SFW (Safe For Work).
This project helps website administrators, content moderators, or platform managers automatically identify and filter explicit or inappropriate images. It takes a collection of images as input and categorizes each as 'Porn', 'Hentai', 'Sexy', 'Drawing', or 'Neutral', allowing for quick identification of Not Safe For Work (NSFW) content. Anyone responsible for maintaining a safe and appropriate online environment will find this tool useful.
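The batch-filtering workflow described above can be sketched in a few lines of Python. The five category names and the NSFW/SFW grouping come from the project description; the function name and the `{filename: label}` input format are hypothetical, chosen only for illustration.

```python
# Hypothetical post-processing helper. The five category names come from the
# project description; the NSFW/SFW split follows the grouping stated above.
NSFW_LABELS = {"Porn", "Hentai", "Sexy"}
SFW_LABELS = {"Drawing", "Neutral"}

def partition_by_safety(predictions):
    """Split {filename: label} predictions into NSFW and SFW buckets."""
    nsfw, sfw = {}, {}
    for filename, label in predictions.items():
        if label in NSFW_LABELS:
            nsfw[filename] = label
        elif label in SFW_LABELS:
            sfw[filename] = label
        else:
            raise ValueError(f"Unknown label: {label}")
    return nsfw, sfw

# Example usage with made-up filenames:
preds = {"a.jpg": "Neutral", "b.jpg": "Sexy", "c.png": "Drawing"}
nsfw, sfw = partition_by_safety(preds)
# nsfw contains only "b.jpg"; the other two land in the SFW bucket
```

A moderation pipeline would typically quarantine or review everything in the NSFW bucket and pass the SFW bucket through unchanged.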
No commits in the last 6 months.
Use this if you need to automatically screen large volumes of user-generated images or website content for explicit material to ensure compliance and maintain platform safety.
Not ideal if you need to classify explicit video content or require very fine-grained distinctions beyond the provided categories.
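To make the per-image decision concrete: a ResNet classifier like this one emits a raw score (logit) per category, which is converted to probabilities with a softmax before picking the top label. The sketch below shows that post-processing step in plain Python; the category ordering is an assumption, and the function names are hypothetical rather than taken from the repository.

```python
import math

# Assumed output order of the model's 5-way head (not confirmed by the repo):
CATEGORIES = ["Porn", "Hentai", "Sexy", "Drawing", "Neutral"]
NSFW = {"Porn", "Hentai", "Sexy"}

def softmax(logits):
    """Numerically stable softmax over a list of raw model scores."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Return (category, nsfw_flag, confidence) for one image's logits."""
    probs = softmax(logits)
    idx = max(range(len(probs)), key=probs.__getitem__)
    label = CATEGORIES[idx]
    return label, label in NSFW, probs[idx]

# A logit vector that strongly favors the last (Neutral) category:
label, is_nsfw, conf = classify([0.1, 0.2, 0.3, 0.5, 4.0])
# label == "Neutral", is_nsfw == False
```

The same logic applies whether the scores come from ResNet50 or ResNet101; only the backbone producing the logits differs.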
Stars: 7
Forks: 1
Language: Jupyter Notebook
License: Apache-2.0
Category:
Last pushed: Jan 26, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/SumitM0432/Explicit-Content-Classifier-using-ResNet"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
infinitered/nsfwjs
NSFW detection on the client-side via TensorFlow.js
alex000kim/nsfw_data_scraper
Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
mdietrichstein/tensorflow-open_nsfw
Tensorflow Implementation of Yahoo's Open NSFW Model
GantMan/nsfw_model
Keras model of NSFW detector
infinitered/nsfwjs-mobile
NSFWjs in React Native