api4ai/nsfw-examples

API4AI is a cloud-native computer vision & AI platform for startups, enterprises, and individual developers. This repository contains sample mini apps that utilize the NSFW Content Recognition API provided by API4AI.

Quality score: 27 / 100 (Experimental)

This API helps developers integrate NSFW (Not Safe For Work) content detection into their applications. Given an input image, it returns whether the image is NSFW or SFW (Safe For Work), along with a confidence score for each category. It is aimed at software developers who need to automatically moderate user-generated content or filter visual content on their platforms.

Use this if you are a developer building an application and need to automatically identify and flag explicit or inappropriate visual content.

Not ideal if you are looking for a standalone content moderation tool for manual review or a non-developer solution.
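The description above says the API returns an NSFW/SFW verdict with a confidence score per category. As a minimal sketch of how such a result might be consumed, the helper below picks the winning label from a class-to-confidence mapping; the `{"nsfw": ..., "sfw": ...}` response shape is an assumption for illustration, not the documented API4AI schema.

```python
# Hypothetical helper: pick the winning label from an NSFW-detection
# response. The {label: confidence} shape is an assumed simplification
# of the API's per-category scores, not its documented schema.
def classify(classes: dict) -> tuple:
    """Return (label, confidence) for the highest-scoring class,
    e.g. classify({"nsfw": 0.03, "sfw": 0.97}) -> ("sfw", 0.97)."""
    label = max(classes, key=classes.get)
    return label, classes[label]

verdict, score = classify({"nsfw": 0.03, "sfw": 0.97})
print(verdict, score)
```

In a real integration you would feed this the per-category scores parsed from the API's JSON response, then compare `score` against a moderation threshold of your choosing.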

Tags: content moderation, image filtering, application development, platform safety, user-generated content

No package · No dependents

Maintenance: 6 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 0 / 25


Stars: 14
Forks:
Language: HTML
License: MIT
Last pushed: Oct 24, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/api4ai/nsfw-examples"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
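The endpoint in the curl command above follows a simple path pattern (category, owner, repo). As a small sketch, the helper below builds that URL for any repository; the pattern is inferred from the single example shown here and is an assumption, not documented API behavior.

```python
# Hypothetical URL builder for the quality endpoint shown above.
# The category/owner/repo path pattern is inferred from the one
# example curl command and may not hold for other endpoints.
def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a given repository."""
    base = "https://pt-edge.onrender.com/api/v1/quality"
    return f"{base}/{category}/{owner}/{repo}"

print(quality_url("ml-frameworks", "api4ai", "nsfw-examples"))
```

Fetching that URL with any HTTP client (curl, `requests`, etc.) returns the data shown on this page.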