NsfwSpy/NsfwSpy.js

A JavaScript image classifier, written in TypeScript, used to identify explicit/pornographic content.

Score: 36 / 100 (Emerging)

This helps website and app developers automatically screen user-submitted images for explicit content. It takes an image file or element and categorizes it as pornography, sexy, hentai, or neutral. Developers would integrate this into their user-generated content moderation systems.
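To illustrate the gating step such a moderation system needs, here is a minimal sketch. The four category names come from this project; the interface, function name, and 0.5 threshold are illustrative assumptions, not part of the NsfwSpy.js API:

```typescript
// Classification scores for one image; the four category names match
// those this project reports. Values are probabilities summing to ~1.
interface NsfwScores {
  pornography: number;
  sexy: number;
  hentai: number;
  neutral: number;
}

// Decide whether an upload should be blocked. The 0.5 threshold is an
// illustrative assumption, not a value recommended by NsfwSpy.js.
function shouldBlock(scores: NsfwScores, threshold = 0.5): boolean {
  return scores.pornography >= threshold || scores.hentai >= threshold;
}
```

A real integration would run each upload through the classifier and feed its scores to a gate like this before publishing, tuning the threshold (and whether "sexy" counts) to the site's policy.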

No commits in the last 6 months.

Use this if you need to automatically detect and filter nudity, pornography, or sexually suggestive images from user uploads in your web or Node.js application.

Not ideal if you need to classify nudity alone without the separate 'hentai' and 'sexy' categories, or if you want a ready-made moderation service rather than a library for developers to build one.

Tags: content-moderation, user-generated-content, image-screening, online-safety, web-development
Flags: Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 12 / 25


Stars: 53
Forks: 6
Language: TypeScript
License: MIT
Last pushed: Oct 08, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/NsfwSpy/NsfwSpy.js"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
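The endpoint returns JSON; a minimal sketch of consuming it from TypeScript follows. The response shape is an assumption inferred from the fields shown on this page, not a documented schema:

```typescript
// Hypothetical response shape inferred from the fields on this page;
// the real API schema is not documented here and may differ.
interface QualityReport {
  score: number;   // overall 0-100 score, e.g. 36
  tier?: string;   // e.g. "Emerging"
  stars?: number;
  forks?: number;
}

// Build the endpoint URL for a repository in the ml-frameworks category.
function qualityUrl(owner: string, repo: string): string {
  return `https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/${owner}/${repo}`;
}

// Parse and minimally validate a response body.
function parseQualityReport(json: string): QualityReport {
  const data = JSON.parse(json);
  if (typeof data.score !== "number") {
    throw new Error("unexpected response: missing numeric 'score'");
  }
  return data as QualityReport;
}
```

On Node 18+ the two combine as `fetch(qualityUrl("NsfwSpy", "NsfwSpy.js")).then(r => r.text()).then(parseQualityReport)`.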