rgklab/detectron
Official repository for the ICLR 2023 paper "A Learning Based Hypothesis Test for Harmful Covariate Shift"
This tool helps machine learning practitioners determine if a pre-trained model can be trusted to make reliable predictions on a new, unlabeled dataset. You provide your existing model and a new set of data, and it outputs a decision on whether the new data is similar enough to your model's training data. It's designed for data scientists, ML engineers, or anyone deploying models into real-world environments.
No commits in the last 6 months.
Use this if you need to automatically assess whether new, unlabeled data has 'shifted' too much from your model's original training data, potentially causing your model to perform poorly.
Not ideal if you already know your new data is significantly different and you need tools for model retraining or adaptation, rather than just detection.
Stars: 11
Forks: 3
Language: Python
License: GPL-3.0
Category: ml-frameworks
Last pushed: Jan 22, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/rgklab/detectron"
Open to everyone: anonymous requests are limited to 100/day with no key needed; a free key raises the limit to 1,000/day.
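For scripted access, the curl command above can be reproduced in Python. A minimal sketch, assuming the endpoint returns JSON; only the URL pattern comes from the example above, and the helper names (`quality_url`, `fetch_quality`) are hypothetical:

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-data URL for a repository."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the payload (assumes a JSON response body)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


print(quality_url("ml-frameworks", "rgklab", "detectron"))
```

For rates above the anonymous limit, the free key would typically be attached as a header or query parameter; the exact mechanism is not documented here.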
Higher-rated alternatives
adapt-python/adapt
Awesome Domain Adaptation Python Toolbox
corenel/pytorch-adda
A PyTorch implementation for Adversarial Discriminative Domain Adaptation
jindongwang/transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning etc. Papers,...
thuml/Transfer-Learning-Library
Transfer Learning Library for Domain Adaptation, Task Adaptation, and Domain Generalization
KaiyangZhou/Dassl.pytorch
A PyTorch toolbox for domain generalization, domain adaptation and semi-supervised learning.