steelcityamir/safe-content-ai
A fast, accurate API for detecting NSFW images.
This tool helps online platforms automatically check images for Not Safe For Work (NSFW) content. You submit images directly or provide links to them, and the API returns whether the content is NSFW along with a confidence score. It's designed for content moderators, community managers, and platform administrators who need to maintain a safe environment.
108 stars. No commits in the last 6 months.
Use this if you manage a digital platform or community and need an automated way to screen user-submitted images or external image links for inappropriate content quickly and accurately.
Not ideal if you need to detect nuanced content violations beyond general NSFW categories or require human-in-the-loop review for every flagged image.
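If you self-host the service, screening an image comes down to a single HTTP call. The sketch below assumes a local instance on port 8000 and a /v1/detect route that accepts a multipart image upload; the route name, form field, and response keys are illustrative assumptions, so check the repository's README for the actual API.

import requests

# Hypothetical endpoint of a locally running safe-content-ai instance;
# the actual path and port depend on how the service is deployed.
API_URL = "http://localhost:8000/v1/detect"

def check_image(path: str) -> dict:
    # Upload the image as multipart form data and return the parsed JSON verdict.
    with open(path, "rb") as f:
        response = requests.post(API_URL, files={"file": f}, timeout=30)
    response.raise_for_status()
    # Illustrative response shape: {"is_nsfw": false, "confidence_percentage": 97.3}
    return response.json()

if __name__ == "__main__":
    print(check_image("example.jpg"))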
Stars: 108
Forks: 16
Language: Python
License: MIT
Category: ml-frameworks
Last pushed: May 31, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/steelcityamir/safe-content-ai"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
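The same data can be pulled from a script instead of curl. This is a minimal sketch using Python's requests library against the URL shown above on the no-key tier; the exact fields in the returned JSON are not documented here, so treat the printed structure as whatever the endpoint actually returns.

import requests

# Quality-data endpoint from the curl example above (no API key, 100 requests/day).
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/steelcityamir/safe-content-ai")

response = requests.get(URL, timeout=15)
response.raise_for_status()
print(response.json())  # e.g. stars, forks, and last-push date for the repository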
Higher-rated alternatives
infinitered/nsfwjs
NSFW detection on the client-side via TensorFlow.js
alex000kim/nsfw_data_scraper
Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
mdietrichstein/tensorflow-open_nsfw
Tensorflow Implementation of Yahoo's Open NSFW Model
GantMan/nsfw_model
Keras model of NSFW detector
infinitered/nsfwjs-mobile
NSFWjs in React Native