infinitered/nsfwjs

NSFW detection on the client-side via TensorFlow.js

Score: 66 / 100 (Established)

This tool helps website administrators, content moderators, and online platform managers automatically check images for inappropriate content directly within a user's web browser. It takes an image as input and classifies it into five categories ('Drawing', 'Hentai', 'Neutral', 'Porn', 'Sexy'), each with a probability score, so a page can judge an image's suitability and filter content in real time without sending user data to a server.
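The classification flow above can be sketched in a few lines. This is a minimal browser-side example based on the project's documented `load()` and `classify()` calls; the `isUnsafe` helper, its 0.7 threshold, and the choice of which categories to flag are illustrative assumptions, not part of the library.

```javascript
// Hypothetical moderation helper around nsfwjs. The policy check is kept
// pure (no model calls) so it can be tested in isolation.
function isUnsafe(predictions, threshold = 0.7) {
  // Categories treated as explicit here are an assumed policy choice.
  const flagged = new Set(['Porn', 'Hentai', 'Sexy']);
  return predictions.some(
    (p) => flagged.has(p.className) && p.probability >= threshold
  );
}

// Browser usage (sketch): load the model once, then classify <img> elements.
async function checkImage(imgElement) {
  const nsfwjs = await import('nsfwjs');   // pulls in TensorFlow.js
  const model = await nsfwjs.load();       // downloads the default model
  // classify() resolves to [{ className, probability }, ...]
  const predictions = await model.classify(imgElement);
  return isUnsafe(predictions);
}
```

In practice the model should be loaded once and reused across images, since loading is the expensive step.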

8,818 stars. Actively maintained with 1 commit in the last 30 days. Available on npm.

Use this if you need to identify and filter indecent images directly in your users' browsers, keeping your website or online platform safe without routing images through a server.

Not ideal if you require perfect accuracy for highly sensitive applications (the classifier is not infallible), or if you need server-side image analysis.

content-moderation online-safety user-generated-content website-administration digital-publishing
No Dependents
Maintenance 13 / 25
Adoption 10 / 25
Maturity 25 / 25
Community 18 / 25


Stars: 8,818
Forks: 589
Language: TypeScript
License: MIT
Last pushed: Feb 23, 2026
Commits (30d): 1

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/infinitered/nsfwjs"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.