api4ai/nsfw-examples
API4AI is a cloud-native computer vision & AI platform for startups, enterprises, and individual developers. This repository contains sample mini apps that use the NSFW Content Recognition API provided by API4AI.
The API helps developers integrate NSFW (Not Safe For Work) content detection into their applications: you submit an image, and it returns whether the image is NSFW or SFW (Safe For Work), along with a confidence score for each category. It is aimed at software developers who need to automatically moderate user-generated content or filter visual content on their platforms.
Use this if you are building an application and need to automatically identify and flag explicit or inappropriate visual content.
Not ideal if you are looking for a standalone content moderation tool for manual review, or for a non-developer solution.
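The classify-and-score workflow described above can be sketched in a few lines. This is a minimal sketch only: the endpoint URL and the `url` form field are assumptions based on API4AI's public demo examples, and the response shape (a dict of per-category confidences) is illustrative; consult the official API4AI docs for the real endpoint, request fields, and authentication.

```python
import json
import urllib.parse
import urllib.request

# Assumed demo endpoint -- verify against the API4AI documentation.
API_URL = "https://demo.api4ai.cloud/nsfw/v1/results"


def classify_by_url(image_url: str) -> dict:
    """POST a hosted image's URL to the NSFW API and return the parsed JSON.

    The "url" form field is an assumption; the docs may require a
    multipart "image" upload or an API key header instead.
    """
    data = urllib.parse.urlencode({"url": image_url}).encode()
    request = urllib.request.Request(API_URL, data=data)
    with urllib.request.urlopen(request) as resp:
        return json.loads(resp.read())


def top_label(confidences: dict) -> str:
    """Pick the category with the highest confidence score.

    Example input shape (illustrative): {"nsfw": 0.02, "sfw": 0.98}.
    """
    return max(confidences, key=confidences.get)
```

In a moderation pipeline you would typically call `classify_by_url` on each user-submitted image and route it based on `top_label`, possibly with a manual-review bucket for scores near 0.5.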
Stars
14
Forks
—
Language
HTML
License
MIT
Category
Last pushed
Oct 24, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/api4ai/nsfw-examples"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
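The curl call above can also be made from code. The sketch below fetches the stats endpoint and adds a small helper for staying under the 100-requests/day free tier; the JSON field names returned by the endpoint are not documented here, so inspect the actual response before relying on any key.

```python
import json
import urllib.request

# Endpoint from the curl example above.
STATS_URL = (
    "https://pt-edge.onrender.com/api/v1/quality/"
    "ml-frameworks/api4ai/nsfw-examples"
)


def fetch_repo_stats(url: str = STATS_URL) -> dict:
    """Fetch and parse the repo-quality JSON (schema not documented here)."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())


def min_interval_seconds(daily_quota: int) -> float:
    """Minimum spacing between requests to evenly fit a daily quota.

    With the keyless 100/day tier, that is 86400 / 100 = 864 seconds.
    """
    return 86400 / daily_quota
```

For a polling dashboard, sleeping `min_interval_seconds(100)` (or 1000 with a free key) between calls keeps you within the published limits.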
Higher-rated alternatives
infinitered/nsfwjs
NSFW detection on the client-side via TensorFlow.js
alex000kim/nsfw_data_scraper
Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
mdietrichstein/tensorflow-open_nsfw
TensorFlow implementation of Yahoo's Open NSFW model
GantMan/nsfw_model
Keras model of NSFW detector
infinitered/nsfwjs-mobile
NSFWjs in React Native