OurBigAdventure/Swift_NSFW_Detector
An NSFW image detector for Swift built as an extension on UIImage.
This tool automatically identifies inappropriate (NSFW) images that users might try to share within your app. It takes any image a user selects and quickly classifies it as 'safe' or 'unsafe', along with a confidence score, so you can filter adult or objectionable content before it is displayed. Mobile app developers and product managers responsible for user-generated-content platforms would find this useful.
No commits in the last 6 months.
Use this if you need a straightforward way to moderate user-submitted images in your iOS application to prevent the display of sexually explicit or otherwise inappropriate content.
Not ideal if you require highly nuanced content moderation beyond simple NSFW detection, such as identifying hate speech in images or very specific types of graphic violence.
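Since the library is described as a UIImage extension that yields a safe/unsafe rating with a confidence score, a moderation call might look roughly like the sketch below. Note that the method name (`checkNSFW`) and the shape of its result are assumptions for illustration, not the library's documented API — consult the repository's README for the actual signatures.

```swift
import UIKit

// Hypothetical usage sketch. `checkNSFW`, `isSafe`, and `confidence`
// are assumed names; the real extension's API may differ.
func moderateUpload(_ image: UIImage, completion: @escaping (Bool) -> Void) {
    image.checkNSFW { isSafe, confidence in
        // Treat low-confidence "unsafe" results as safe, or route them
        // to human review, depending on your moderation policy.
        if isSafe || confidence < 0.8 {
            completion(true)   // allow the image to be shared
        } else {
            completion(false)  // block the upload and notify the user
        }
    }
}
```

A common design choice is to run this check client-side before upload (saving bandwidth and server cost), while still re-checking server-side for defense in depth.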
Stars: 14
Forks: 2
Language: Swift
License: —
Category: —
Last pushed: Jan 08, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/OurBigAdventure/Swift_NSFW_Detector"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Higher-rated alternatives
infinitered/nsfwjs
NSFW detection on the client-side via TensorFlow.js
alex000kim/nsfw_data_scraper
Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
mdietrichstein/tensorflow-open_nsfw
Tensorflow Implementation of Yahoo's Open NSFW Model
GantMan/nsfw_model
Keras model of NSFW detector
infinitered/nsfwjs-mobile
NSFWjs in React Native