qeeqbox/image-analyzer
Interface for image-related deep learning models (e.g., NSFW, MAYBE, and SFW)
This tool helps categorize images based on their content, specifically identifying if they are safe, potentially inappropriate ('maybe'), or unsafe. You provide a collection of images along with your own pre-trained deep learning models, and it outputs classifications for each image. This is useful for content moderators, platform managers, or anyone needing to automatically filter large volumes of images.
No commits in the last 6 months.
Use this if you need to automate the filtering and categorization of images for content safety, using your own customized deep learning models.
Not ideal if you need an out-of-the-box solution with pre-trained models, as it requires you to provide and manage your own image classification models.
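Since the tool expects you to supply your own models, the core idea reduces to mapping each model's unsafe-content score onto the three labels. The sketch below is a hypothetical illustration of that mapping, not image-analyzer's actual API: the function name `categorize` and the threshold values are assumptions.

```python
# Hypothetical sketch: mapping a model's "unsafe" probability onto the three
# labels the tool describes (SFW / MAYBE / NSFW). Thresholds are illustrative
# assumptions, not values taken from image-analyzer itself.

def categorize(unsafe_prob: float,
               maybe_threshold: float = 0.4,
               nsfw_threshold: float = 0.8) -> str:
    """Return 'SFW', 'MAYBE', or 'NSFW' for a model's unsafe-content score."""
    if unsafe_prob >= nsfw_threshold:
        return "NSFW"
    if unsafe_prob >= maybe_threshold:
        return "MAYBE"
    return "SFW"

# Example scores as a pre-trained classifier might produce them:
scores = {"beach.jpg": 0.12, "ad_banner.png": 0.55, "flagged.jpg": 0.93}
labels = {name: categorize(p) for name, p in scores.items()}
print(labels)  # {'beach.jpg': 'SFW', 'ad_banner.png': 'MAYBE', 'flagged.jpg': 'NSFW'}
```

In practice you would tune the two thresholds per model, since different classifiers calibrate their probabilities differently.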
Stars: 34
Forks: 3
Language: HTML
License: AGPL-3.0
Category: ml-frameworks
Last pushed: Apr 15, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/qeeqbox/image-analyzer"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
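The curl example above can also be reproduced from Python. The sketch below only assumes the URL pattern shown in that command (`/quality/{category}/{owner}/{repo}`); the response schema is not documented here, so the JSON is printed as-is rather than parsed into assumed keys.

```python
import json
from urllib.request import urlopen

# Base path taken from the curl example above; the helper and its
# parameter names are illustrative, not part of a published client.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository API endpoint URL."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "qeeqbox", "image-analyzer")
print(url)

# Network call, kept commented so the sketch runs offline:
# data = json.load(urlopen(url))
# print(json.dumps(data, indent=2))
```

Without an API key this counts against the 100-requests/day limit, so cache responses locally if you poll many repositories.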
Higher-rated alternatives
infinitered/nsfwjs
NSFW detection on the client-side via TensorFlow.js
alex000kim/nsfw_data_scraper
Collection of scripts to aggregate image data for training an NSFW image classifier
mdietrichstein/tensorflow-open_nsfw
TensorFlow implementation of Yahoo's Open NSFW model
GantMan/nsfw_model
Keras implementation of an NSFW detector
infinitered/nsfwjs-mobile
NSFWjs in React Native