infinitered/nsfwjs-mobile
NSFWjs in React Native
This project shows mobile developers how to add automatic detection of potentially indecent images directly inside a React Native app. It takes images uploaded or captured by users and classifies them into categories such as "Porn", "Sexy", and "Neutral", enabling immediate client-side filtering. Developers building social platforms, content-sharing apps, or community forums can use it for content moderation.
124 stars. No commits in the last 6 months.
Use this if you are a mobile app developer using React Native and need to automatically detect and filter user-generated indecent image content on the client side without sending it to a server.
Not ideal if you need server-side content moderation, advanced AI-driven content analysis beyond indecency, or if your application is not built with React Native.
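To make the client-side filtering concrete, here is a minimal sketch of how classifier output might be thresholded. It assumes nsfwjs's documented `load()` / `model.classify(tensor)` API, which returns an array of `{ className, probability }` entries over the classes Drawing, Hentai, Neutral, Porn, and Sexy; the `isIndecent` helper and the 0.7 threshold are illustrative, not part of this repo.

```javascript
// Decide whether to hide an image based on classifier output.
// `predictions` has the shape returned by nsfwjs's model.classify():
// [{ className: 'Neutral', probability: 0.91 }, ...]
function isIndecent(predictions, threshold = 0.7) {
  const flagged = ['Porn', 'Sexy', 'Hentai'];
  return predictions.some(
    (p) => flagged.includes(p.className) && p.probability >= threshold
  );
}

// Example result shaped like model.classify()'s output:
const predictions = [
  { className: 'Neutral', probability: 0.91 },
  { className: 'Sexy', probability: 0.06 },
  { className: 'Porn', probability: 0.02 },
];

console.log(isIndecent(predictions)); // prints false: no flagged class clears 0.7
```

In a React Native app the input tensor would typically come from the image bytes via `@tensorflow/tfjs-react-native` before being passed to `model.classify()`; tuning the threshold trades false positives against missed content.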
Stars: 124
Forks: 25
Language: JavaScript
License: MIT
Category:
Last pushed: Jan 26, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/infinitered/nsfwjs-mobile"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
Compare
Higher-rated alternatives
infinitered/nsfwjs
NSFW detection on the client-side via TensorFlow.js
alex000kim/nsfw_data_scraper
Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
mdietrichstein/tensorflow-open_nsfw
Tensorflow Implementation of Yahoo's Open NSFW Model
GantMan/nsfw_model
Keras model of NSFW detector
TheHamkerCat/NSFW_Detection_API
Rest API Written In Python To Classify NSFW Images.