nazim1021/OOD-detection-using-OECC
Outlier Exposure with Confidence Control for Out-of-Distribution Detection
This project helps machine learning engineers build more robust image and text classification systems. It takes a pre-trained deep neural network and a dataset of 'in-distribution' samples, then fine-tunes the network so it assigns low confidence to inputs that differ significantly from the data it was originally trained on. The output is a classification model that keeps its accuracy on familiar data while reliably flagging unusual or unexpected inputs.
No commits in the last 6 months.
Use this if you need your image or text classification models to reliably detect inputs that fall outside their expected distribution, without needing outlier examples drawn from the specific anomalies you expect to see at test time.
Not ideal if your problem does not involve deep neural networks or if you already have a large, diverse dataset of known outlier samples.
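The general outlier-exposure recipe this repo builds on can be sketched briefly. The following is a minimal NumPy illustration of the idea, not the repo's actual code: standard cross-entropy on in-distribution samples plus a term that pushes predictions on auxiliary outliers toward the uniform distribution (the OECC paper adds further confidence-control regularization on top of this). The function names and the `lam` weight are illustrative choices, not this project's API.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def oe_loss(in_logits, in_labels, out_logits, lam=0.5):
    """Outlier-exposure-style objective (sketch):
    cross-entropy on in-distribution samples, plus a term that is
    minimized when outlier predictions match the uniform distribution."""
    p_in = softmax(in_logits)
    # standard cross-entropy on labeled in-distribution data
    ce = -np.log(p_in[np.arange(len(in_labels)), in_labels]).mean()
    p_out = softmax(out_logits)
    # cross-entropy between a uniform target and the predicted
    # distribution on outliers: -(1/k) * sum_j log p_j, averaged
    uniform_ce = -np.log(p_out).mean(axis=1).mean()
    return ce + lam * uniform_ce

def msp_score(logits):
    """Maximum softmax probability: low values suggest OOD input."""
    return softmax(logits).max(axis=1)
```

A network fine-tuned with an objective like this tends to produce low `msp_score` values on unfamiliar inputs, so a simple threshold on that score can serve as the OOD detector.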
Stars: 71
Forks: 8
Language: Jupyter Notebook
License: —
Category: —
Last pushed: Mar 13, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/nazim1021/OOD-detection-using-OECC"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
yzhao062/pyod
A Python Library for Outlier and Anomaly Detection, Integrating Classical and Deep Learning Techniques
unit8co/darts
A python library for user-friendly forecasting and anomaly detection on time series.
elki-project/elki
ELKI Data Mining Toolkit
raphaelvallat/antropy
AntroPy: entropy and complexity of (EEG) time-series in Python
Minqi824/ADBench
Official Implement of "ADBench: Anomaly Detection Benchmark", NeurIPS 2022.