thuml/LogME
Code release for "LogME: Practical Assessment of Pre-trained Models for Transfer Learning" (ICML 2021) and "Ranking and Tuning Pre-trained Models: A New Paradigm for Exploiting Model Hubs" (JMLR 2022)
This tool helps machine learning engineers and researchers quickly assess which pre-trained deep learning models will perform best on a new dataset or task, without needing extensive fine-tuning. You provide feature representations and corresponding labels from your dataset, and it outputs a 'LogME score' that predicts transfer learning performance. This allows you to efficiently select the most compatible pre-trained model for your specific classification or regression problem.
211 stars. No commits in the last 6 months.
Use this if you need to choose the best pre-trained model from a diverse collection (like a model hub) for a new task, and you want to avoid time-consuming hyper-parameter tuning for each candidate model.
Not ideal if you are looking for a tool to perform the actual fine-tuning of pre-trained models, as this focuses solely on ranking and selection.
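The workflow described above — extract features with a candidate model, pair them with your labels, and get back a single score — can be sketched from scratch. The function below is a simplified, self-contained implementation of the evidence-maximization idea from the LogME paper (Bayesian linear regression per one-vs-rest target, with a fixed-point update of the prior/noise precisions over an SVD of the features). The name `logme_score` and all details here are illustrative assumptions, not the repository's actual API; the repo ships its own optimized implementation.

```python
import numpy as np

def logme_score(f, y, regression=False, max_iter=100, tol=1e-3):
    """Simplified LogME sketch: mean maximum log-evidence per sample.

    f : (n, d) array of features extracted by a pre-trained model.
    y : (n,) array of labels (ints for classification, floats for regression).
    NOTE: illustrative re-implementation, not the repository's code.
    """
    n, d = f.shape
    u, s, _ = np.linalg.svd(f, full_matrices=False)  # economy SVD, reused per target
    s2 = s ** 2
    k = len(s2)

    if regression:
        targets = y.reshape(n, -1).T.astype(float)
    else:
        # One binary one-vs-rest target per class.
        targets = np.stack([(y == c).astype(float) for c in np.unique(y)])

    evidences = []
    for y_ in targets:
        z2 = (u.T @ y_) ** 2            # squared projections onto singular vectors
        rest = y_ @ y_ - z2.sum()       # target energy outside the column span of f
        alpha = beta = 1.0
        for _ in range(max_iter):
            gamma = (beta * s2 / (alpha + beta * s2)).sum()
            m2 = (beta ** 2 * s2 * z2 / (alpha + beta * s2) ** 2).sum()
            res = (alpha ** 2 * z2 / (alpha + beta * s2) ** 2).sum() + rest
            alpha_new = gamma / (m2 + 1e-12)
            beta_new = (n - gamma) / (res + 1e-12)
            done = (abs(alpha_new - alpha) / alpha < tol
                    and abs(beta_new - beta) / beta < tol)
            alpha, beta = alpha_new, beta_new
            if done:
                break
        # Recompute fit statistics at the converged (alpha, beta).
        m2 = (beta ** 2 * s2 * z2 / (alpha + beta * s2) ** 2).sum()
        res = (alpha ** 2 * z2 / (alpha + beta * s2) ** 2).sum() + rest
        logdet = np.log(alpha + beta * s2).sum() + (d - k) * np.log(alpha)
        evidence = (n / 2 * np.log(beta) + d / 2 * np.log(alpha)
                    - n / 2 * np.log(2 * np.pi)
                    - beta / 2 * res - alpha / 2 * m2 - logdet / 2) / n
        evidences.append(evidence)
    return float(np.mean(evidences))
```

A quick sanity check of the intended behavior: features that carry label information should score higher than random noise, which is exactly the signal used to rank candidate models.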
Stars
211
Forks
18
Language
Python
License
MIT
Category
Last pushed
Oct 06, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/thuml/LogME"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
limix-ldm-ai/LimiX
LimiX: Unleashing Structured-Data Modeling Capability for Generalist Intelligence...
tatsu-lab/stanford_alpaca
Code and documentation to train Stanford's Alpaca models, and generate the data.
google-research/plur
PLUR (Programming-Language Understanding and Repair) is a collection of source code datasets...
YalaLab/pillar-finetune
Finetuning framework for Pillar medical imaging models.
joisino/reeval-wmd
Code for "Re-evaluating Word Mover’s Distance" (ICML 2022)