cure-lab/DiffGuard

[ICCV 2023] The official implementation of paper "DiffGuard: Semantic Mismatch-Guided Out-of-Distribution Detection using Pre-trained Diffusion Models"

Overall score: 19 / 100 (Experimental)

This project helps machine learning engineers and researchers validate the robustness of image classification models. Given an existing image classifier and input images that may be unusual or unexpected, it flags whether each image is out-of-distribution. This helps ensure that classifiers only make predictions on data they were trained to understand, preventing errors on unfamiliar inputs.

No commits in the last 6 months.

Use this if you need to reliably detect when an image fed to your classifier is semantically different from the data it was trained on.

Not ideal if your goal is to improve image quality, generate new images, or if you are not working with image classification models.

image-classification machine-learning-robustness out-of-distribution-detection computer-vision model-validation
Badges: No License · Stale 6m · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 8 / 25
Community: 5 / 25

Stars: 18
Forks: 1
Language: Python
License: None
Last pushed: Jan 08, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/cure-lab/DiffGuard"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
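For programmatic access, the curl command above can be reproduced in Python. This is a minimal sketch: only the endpoint URL pattern (`/api/v1/quality/{category}/{owner}/{repo}`) comes from the page; the `fetch_quality` helper name and any JSON field names in the response are assumptions, so inspect the actual payload before relying on it.

```python
# Sketch of calling the pt-edge quality API from Python.
# Assumption: the URL pattern is /api/v1/quality/{category}/{owner}/{repo},
# as shown in the curl example; response fields are not documented here.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repository's quality report."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON report (100 requests/day without a key)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Prints the same URL the curl example targets (no request is made here).
print(quality_url("ml-frameworks", "cure-lab", "DiffGuard"))
```

To stay within the anonymous rate limit, cache the response locally rather than re-fetching on every run; with a free key the limit rises to 1,000 requests/day.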