M1chlCZ/nsfw_sherlock
Simple drop-in API to determine whether an image is NSFW, using TensorFlow
This tool helps content moderators, platform administrators, and community managers automatically identify inappropriate images and text. Given an image, it reports whether the picture, or any text within it, contains Not Safe For Work (NSFW) content. It can also return specific labels such as 'porn' or 'hentai' with confidence scores, allowing for fine-tuned content filtering.
No commits in the last 6 months.
Use this if you need an automated way to screen user-submitted images or content for objectionable material before it goes live on your platform.
Not ideal if you need nuanced, human-level understanding of context, or moderation of highly subjective content that goes beyond explicit imagery and text.
Stars
13
Forks
4
Language
Go
License
AGPL-3.0
Category
Last pushed
Aug 22, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/M1chlCZ/nsfw_sherlock"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
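The same endpoint can be called from Go, the repo's own language. A minimal sketch, assuming only the URL shown in the curl example above; the `ml-frameworks` category segment is taken from that URL, and the response schema is not documented here, so the body is returned as raw bytes:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

// qualityURL builds the pt-edge quality endpoint for a repo.
// The category segment ("ml-frameworks" here) comes from the
// curl example; other repos may live under a different category.
func qualityURL(category, owner, repo string) string {
	return fmt.Sprintf(
		"https://pt-edge.onrender.com/api/v1/quality/%s/%s/%s",
		category, owner, repo,
	)
}

// fetchQuality GETs the endpoint and returns the raw response body.
// The response format is undocumented here, so no JSON decoding
// is attempted.
func fetchQuality(url string) ([]byte, error) {
	resp, err := http.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	return io.ReadAll(resp.Body)
}

func main() {
	url := qualityURL("ml-frameworks", "M1chlCZ", "nsfw_sherlock")
	fmt.Println(url)
	// body, err := fetchQuality(url) // uncomment to actually hit the API
}
```

Building the URL separately from the request makes the no-key rate limit easy to respect: callers can batch or cache URLs and decide when to fetch.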
Higher-rated alternatives
infinitered/nsfwjs
NSFW detection on the client-side via TensorFlow.js
alex000kim/nsfw_data_scraper
Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
mdietrichstein/tensorflow-open_nsfw
Tensorflow Implementation of Yahoo's Open NSFW Model
GantMan/nsfw_model
Keras model of NSFW detector
infinitered/nsfwjs-mobile
NSFWjs in React Native