infinitered/nsfwjs-mobile

NSFWjs in React Native

Score: 46 / 100 (Emerging)

This project lets mobile developers add on-device detection of potentially indecent images. It takes images uploaded or captured by users in a React Native app and classifies them into categories such as 'sexy', 'pornography', or 'neutral', enabling immediate filtering. It is aimed at developers of social platforms, content-sharing apps, and community forums that need built-in content moderation.

124 stars. No commits in the last 6 months.

Use this if you are a mobile app developer using React Native and need to automatically detect and filter user-generated indecent image content on the client side without sending it to a server.

Not ideal if you need server-side content moderation, advanced AI-driven content analysis beyond indecency, or if your application is not built with React Native.

mobile-app-development content-moderation user-generated-content react-native image-filtering
Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 20 / 25


Stars: 124
Forks: 25
Language: JavaScript
License: MIT
Last pushed: Jan 26, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/infinitered/nsfwjs-mobile"

Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.