OurBigAdventure/Swift_NSFW_Detector

An NSFW image detector for Swift built as an extension on UIImage.

Score: 23 / 100 (Experimental)

This tool helps you automatically identify and filter out inappropriate (NSFW) images that users might try to share within your app. It classifies any image a user selects and returns a 'safe' or 'unsafe' verdict along with a confidence score. Mobile app developers and product managers responsible for user-generated-content platforms are the intended audience.
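The exact API surface is not documented on this page. As a minimal sketch, assuming the UIImage extension exposes a completion-based check that reports a verdict and a confidence value (the method name checkNSFW, its signature, and the 0.8 threshold below are all hypothetical; consult the repo's README for the real API):

import UIKit

// Hypothetical stand-in for the library's classifier, included as a stub
// so this sketch compiles on its own. The real extension may differ.
extension UIImage {
    func checkNSFW(completion: @escaping (_ isSafe: Bool, _ confidence: Float) -> Void) {
        completion(true, 0.99) // stub result
    }
}

// Gate a user-submitted image before displaying it.
func moderate(_ image: UIImage, completion: @escaping (_ allowed: Bool) -> Void) {
    image.checkNSFW { isSafe, confidence in
        // Be conservative: only allow images rated safe with high confidence.
        completion(isSafe && confidence >= 0.8)
    }
}

Thresholding the confidence score this way lets you trade false positives against false negatives to suit your platform's risk tolerance.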

No commits in the last 6 months.

Use this if you need a straightforward way to moderate user-submitted images in your iOS application to prevent the display of sexually explicit or otherwise inappropriate content.

Not ideal if you require highly nuanced content moderation beyond simple NSFW detection, such as identifying hate speech in images or very specific types of graphic violence.

Tags: content-moderation, user-generated-content, mobile-app-safety, image-filtering, iOS-development
Flags: No License, Stale (6m), No Package, No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 8 / 25
Community 10 / 25


Stars: 14
Forks: 2
Language: Swift
License: none
Last pushed: Jan 08, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/OurBigAdventure/Swift_NSFW_Detector"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
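For app-side consumption, here is a minimal Swift sketch of the same request. The decoded fields (score, grade) are assumptions about the response shape, so inspect the raw JSON before relying on them:

import Foundation

// Assumed response shape; verify against the actual JSON first.
struct QualityReport: Decodable {
    let score: Int?
    let grade: String?
}

let url = URL(string: "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/OurBigAdventure/Swift_NSFW_Detector")!
URLSession.shared.dataTask(with: url) { data, _, error in
    guard let data, error == nil else { return }
    if let report = try? JSONDecoder().decode(QualityReport.self, from: data) {
        print("score:", report.score ?? -1, "grade:", report.grade ?? "n/a")
    }
}.resume()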