NsfwSpy/NsfwSpy.js
An image classifier, written in TypeScript, that identifies explicit/pornographic content in JavaScript and Node.js applications.
It helps website and app developers automatically screen user-submitted images for explicit content: given an image file or element, it classifies the image as pornography, sexy, hentai, or neutral. Developers typically integrate it into their user-generated-content moderation systems.
No commits in the last 6 months.
Use this if you need to automatically detect and filter nudity, pornography, or sexually suggestive images from user uploads in your web or Node.js application.
Not ideal if you only need to detect photographic nudity without the hentai or 'sexy' categories, or if you are not a developer building a moderation system.
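The four-category output above lends itself to simple threshold-based moderation. The sketch below is hypothetical: the score shape and the `shouldBlock` helper are assumptions for illustration, not NsfwSpy.js's actual API.

```typescript
// Hypothetical sketch of acting on classifier scores in a moderation pipeline.
// The NsfwScores shape is assumed; check the NsfwSpy.js docs for the real return type.
interface NsfwScores {
  pornography: number;
  sexy: number;
  hentai: number;
  neutral: number; // all values assumed to be probabilities in [0, 1]
}

function shouldBlock(scores: NsfwScores, threshold = 0.5): boolean {
  // Block when any explicit category crosses the threshold.
  return (
    scores.pornography >= threshold ||
    scores.hentai >= threshold ||
    scores.sexy >= threshold
  );
}

// A clearly neutral image passes; a high-pornography score is blocked.
shouldBlock({ pornography: 0.01, sexy: 0.05, hentai: 0.0, neutral: 0.94 }); // false
shouldBlock({ pornography: 0.92, sexy: 0.03, hentai: 0.01, neutral: 0.04 }); // true
```

Tuning the threshold per category (e.g. stricter for pornography than for 'sexy') is a common refinement.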
Stars
53
Forks
6
Language
TypeScript
License
MIT
Category
ml-frameworks
Last pushed
Oct 08, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/NsfwSpy/NsfwSpy.js"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
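The same endpoint can be called programmatically. The sketch below assumes Node 18+ (built-in `fetch`); the response schema is not documented here, so the result is left untyped.

```typescript
// Sketch: fetching repo quality data from the API shown above (Node 18+).
const BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks";

// Build the endpoint URL for a given owner/repo pair.
function qualityUrl(owner: string, repo: string): string {
  return `${BASE}/${owner}/${repo}`;
}

// Fetch and parse the JSON payload; the shape is unspecified, hence `unknown`.
async function fetchQuality(owner: string, repo: string): Promise<unknown> {
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`API request failed: HTTP ${res.status}`);
  return res.json();
}
```

Usage: `await fetchQuality("NsfwSpy", "NsfwSpy.js")` reproduces the curl call above; remember the 100 requests/day limit without a key.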
Higher-rated alternatives
infinitered/nsfwjs
NSFW detection on the client-side via TensorFlow.js
alex000kim/nsfw_data_scraper
Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
mdietrichstein/tensorflow-open_nsfw
Tensorflow Implementation of Yahoo's Open NSFW Model
GantMan/nsfw_model
Keras model of NSFW detector
infinitered/nsfwjs-mobile
NSFWjs in React Native