SumitM0432/Explicit-Content-Classifier-using-ResNet

The objective of this project is to classify explicit images such as pornography and hentai. The classifiers used are ResNet50 and ResNet101, variants of the Residual Neural Network (ResNet) architecture. The model is trained on five categories: Porn, Hentai, Sexy, Drawing, and Neutral. The first three can be further grouped as NSFW (Not Safe For Work), while Drawing and Neutral count as SFW (Safe For Work).
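As a quick illustration of the grouping described above, the mapping from the five model labels to the NSFW/SFW buckets can be sketched as follows (label names are taken from the description; the helper function name is illustrative, not part of the repository):

```python
# Map each of the five classifier labels to a safety bucket,
# following the NSFW/SFW grouping given in the project description.
NSFW_LABELS = {"Porn", "Hentai", "Sexy"}
SFW_LABELS = {"Drawing", "Neutral"}

def safety_bucket(label: str) -> str:
    """Return 'NSFW' or 'SFW' for a predicted label (illustrative helper)."""
    if label in NSFW_LABELS:
        return "NSFW"
    if label in SFW_LABELS:
        return "SFW"
    raise ValueError(f"Unknown label: {label}")
```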

Score: 29 / 100 (Experimental)

This project helps website administrators, content moderators, or platform managers automatically identify and filter explicit or inappropriate images. It takes a collection of images as input and categorizes each as 'Porn', 'Hentai', 'Sexy', 'Drawing', or 'Neutral', allowing for quick identification of Not Safe For Work (NSFW) content. Anyone responsible for maintaining a safe and appropriate online environment will find this tool useful.

No commits in the last 6 months.

Use this if you need to automatically screen large volumes of user-generated images or website content for explicit material to ensure compliance and maintain platform safety.

Not ideal if you need to classify explicit video content or require very fine-grained distinctions beyond the provided categories.

Tags: content-moderation, website-safety, image-filtering, platform-management, user-generated-content

Status: Stale (6 months), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 4 / 25
Maturity: 16 / 25
Community: 9 / 25


Stars: 7
Forks: 1
Language: Jupyter Notebook
License: Apache-2.0
Last pushed: Jan 26, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/SumitM0432/Explicit-Content-Classifier-using-ResNet"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
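The endpoint in the curl example follows an owner/repo path pattern; a small helper to build such URLs in Python could look like this (the base URL is copied from the curl example above; the helper name is illustrative):

```python
# Base URL taken from the curl example on this page.
BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality API URL (illustrative helper)."""
    return f"{BASE}/{owner}/{repo}"
```

Calling `quality_url("SumitM0432", "Explicit-Content-Classifier-using-ResNet")` reproduces the URL used in the curl command.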