qeeqbox/image-analyzer

Interface for image-related deep learning models (e.g. NSFW, Maybe, and SFW)

Score: 31 / 100 (Emerging)

This tool categorizes images by content, labeling each one as safe, potentially inappropriate ('maybe'), or unsafe. You provide a collection of images along with your own pre-trained deep learning models, and it outputs a classification for each image. This is useful for content moderators, platform managers, or anyone who needs to filter large volumes of images automatically.

No commits in the last 6 months.

Use this if you need to automate the filtering and categorization of images for content safety, using your own customized deep learning models.

Not ideal if you need an out-of-the-box solution with pre-trained models, as it requires you to provide and manage your own image classification models.

content-moderation image-filtering platform-safety digital-asset-management abusive-content-detection
Stale (6 months) · No package · No dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 8 / 25


Stars: 34
Forks: 3
Language: HTML
License: AGPL-3.0
Last pushed: Apr 15, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/qeeqbox/image-analyzer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
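The curl call above can also be scripted. Below is a minimal Python sketch, using only the standard library, that builds the same endpoint URL for any repository and fetches it. The response schema is not documented on this page, so the code assumes only that the endpoint returns JSON; the function names and the `category` parameter split are illustrative, not part of an official client.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-data URL for a repository (mirrors the curl example)."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch quality data for a repository.

    Assumes the endpoint returns a JSON object; requires network access.
    Anonymous access is rate-limited (100 requests/day per the page above).
    """
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Usage (requires network, so left commented out):
# data = fetch_quality("ml-frameworks", "qeeqbox", "image-analyzer")
# print(json.dumps(data, indent=2))
```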