lovoo/NSFWDetector

An NSFW (aka porn) detector with CoreML

Score: 44 / 100 (Emerging)

This helps mobile app developers automatically filter images for nudity or sexually explicit content directly within their iOS applications. It takes an image as input and provides a confidence score indicating how likely it is to be inappropriate, allowing developers to decide whether to display or flag the content. This tool is for app developers building social platforms, content moderation tools, or any app that handles user-generated images.

1,654 stars. No commits in the last 6 months.

Use this if you are an iOS developer who needs to detect inappropriate images in your app quickly and efficiently, without relying on external servers.

Not ideal if you need to detect inappropriate content in video, text, or for platforms outside of iOS.
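A minimal sketch of how detection is typically wired into an app follows. The check(image:completion:) call and the .success(nsfwConfidence:) result case follow the usage shown in the project's README; the 0.9 threshold is an arbitrary example value, not a recommendation.

import UIKit
import NSFWDetector

// Sketch: gate a user-submitted image before showing it in a feed.
@available(iOS 12.0, *)
func shouldDisplay(_ image: UIImage, completion: @escaping (Bool) -> Void) {
    NSFWDetector.shared.check(image: image) { result in
        switch result {
        case let .success(nsfwConfidence: confidence):
            // Higher confidence means the image is more likely explicit.
            completion(confidence < 0.9)
        default:
            // Treat detection errors conservatively and hide the image.
            completion(false)
        }
    }
}

Because the classification runs on-device through Core ML, image data never has to leave the phone, which is what makes the no-external-servers use case above possible.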

content-moderation iOS-development user-generated-content app-safety image-filtering
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 18 / 25

How are scores calculated?
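The overall 44 / 100 appears to be the sum of the four category scores: 0 (Maintenance) + 10 (Adoption) + 16 (Maturity) + 18 (Community) = 44.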

Stars: 1,654
Forks: 117
Language: Swift
License: BSD-3-Clause
Last pushed: Aug 29, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/lovoo/NSFWDetector"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
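The same data can also be fetched from Swift. This is a minimal sketch that assumes the endpoint returns JSON and simply prints the raw body, since the response schema isn't documented on this page.

import Foundation

// Sketch: fetch the quality data for lovoo/NSFWDetector from the API above.
let url = URL(string: "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/lovoo/NSFWDetector")!

let task = URLSession.shared.dataTask(with: url) { data, _, error in
    if let error = error {
        print("Request failed: \(error)")
        return
    }
    guard let data = data, let body = String(data: data, encoding: .utf8) else {
        print("Empty or undecodable response")
        return
    }
    print(body)
}
task.resume()

// Keep a command-line script alive until the asynchronous request completes.
RunLoop.main.run(until: Date().addingTimeInterval(10))

At the keyless tier this works as-is; a free key raises the limit from 100 to 1,000 requests/day.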