lakshaychhabra/NSFW-Detection-DL

This repo contains a deep learning implementation for identifying NSFW images.

Score: 31 / 100 (Emerging)

This helps social media managers, content moderators, and platform administrators automatically identify and flag explicit images. You provide it with images, and it tells you whether they contain pornographic or suggestive content, classifying each one as either 'Not Safe For Work' or 'Safe'. This allows you to protect your audience and maintain brand safety.
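The two-way labeling described above boils down to thresholding a model score. The sketch below illustrates only that final decision step; the probability input and the 0.5 cutoff are assumptions for illustration, not values taken from the repo's notebooks:

```python
def classify(prob_nsfw: float, threshold: float = 0.5) -> str:
    """Map a model's NSFW probability to the two labels described above.

    `prob_nsfw` would come from the trained model's output; the 0.5
    threshold is an illustrative assumption, not the repo's actual value.
    """
    return "Not Safe For Work" if prob_nsfw >= threshold else "Safe"

print(classify(0.92))
print(classify(0.10))
```

In practice the threshold is a moderation policy knob: lowering it flags more borderline images at the cost of more false positives.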

No commits in the last 6 months.

Use this if you need a fast and automated way to screen large volumes of images for inappropriate content on your website, app, or social media channels.

Not ideal if you need to detect nuanced forms of inappropriate content beyond explicit imagery, such as hate speech in text or graphic violence.

Tags: content-moderation, brand-safety, social-media-management, platform-security, image-filtering
Badges: No License · Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 8 / 25
Maturity: 8 / 25
Community: 15 / 25


Stars: 53
Forks: 9
Language: Jupyter Notebook
License: None
Last pushed: Mar 08, 2020
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/lakshaychhabra/NSFW-Detection-DL"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
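The same endpoint can be called from Python using only the standard library. The response schema is not documented here, so the decoded JSON is returned as-is; `fetch_quality` performs a real network request, while `quality_url` just builds the URL shown in the curl example:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repo quality endpoint shown in the curl example."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report (makes a network call)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

print(quality_url("lakshaychhabra", "NSFW-Detection-DL"))
```

With an API key (for the higher rate limit), you would presumably attach it as a header or query parameter; the exact mechanism is not specified on this page.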