lovoo/NSFWDetector
An NSFW (i.e., pornography) detector with CoreML
NSFWDetector lets iOS developers automatically screen images for nudity or sexually explicit content directly on-device, using CoreML. It takes an image as input and returns a confidence score indicating how likely the image is to be inappropriate, leaving it to the developer to decide whether to display, blur, or flag the content. It is aimed at developers building social platforms, content moderation tools, or any app that handles user-generated images.
1,654 stars. No commits in the last 6 months.
Use this if you are an iOS developer needing to quickly and efficiently detect inappropriate images within your app without relying on external servers.
Not ideal if you need to detect inappropriate content in video, text, or for platforms outside of iOS.
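The flow described above (image in, confidence score out, developer decides) looks roughly like this in Swift. The `NSFWDetector.shared.check(image:completion:)` call follows the API shown in the repository's README; the wrapper function name and the 0.9 threshold are illustrative assumptions, not part of the library.

```swift
import UIKit
import NSFWDetector

// Sketch: gate a user-submitted image before showing it.
// `moderate` and the 0.9 cutoff are hypothetical; tune the threshold
// to your own tolerance for false positives vs. missed content.
func moderate(_ image: UIImage, completion: @escaping (Bool) -> Void) {
    NSFWDetector.shared.check(image: image) { result in
        switch result {
        case let .success(nsfwConfidence: confidence):
            // confidence is in 0.0...1.0; higher means more likely NSFW.
            completion(confidence < 0.9) // allow only low-confidence images
        default:
            // Fail closed: hide the image if detection errors out.
            completion(false)
        }
    }
}
```

Because detection runs entirely on-device via CoreML, no image ever leaves the phone, which is the main advantage over server-side moderation APIs.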
Stars
1,654
Forks
117
Language
Swift
License
BSD-3-Clause
Category
ML Frameworks
Last pushed
Aug 29, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/lovoo/NSFWDetector"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
Higher-rated alternatives
infinitered/nsfwjs
NSFW detection on the client-side via TensorFlow.js
alex000kim/nsfw_data_scraper
Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
mdietrichstein/tensorflow-open_nsfw
Tensorflow Implementation of Yahoo's Open NSFW Model
GantMan/nsfw_model
Keras model of NSFW detector
infinitered/nsfwjs-mobile
NSFWjs in React Native