SLAMPAI/large-scale-pretraining-transfer

Code for reproducing the experiments on large-scale pre-training and transfer learning for the paper "Effect of large-scale pre-training on full and few-shot transfer learning for natural and medical images" (https://arxiv.org/abs/2106.00116)

Score: 37 / 100 (Emerging)

This project helps researchers working with natural and medical images apply large-scale machine learning to new datasets more efficiently. It pre-trains models on publicly available large-scale image datasets and then lets you quickly fine-tune those models on your specific image classification tasks. Researchers can use it to build accurate image classifiers for medical conditions or other subjects with less effort and less data.

No commits in the last 6 months.

Use this if you need to build high-performing image classification models for natural or medical images, especially when you have limited data for your specific task.

Not ideal if you are not comfortable with command-line operations and dataset preparation, or if your image analysis task is not classification.

Tags: medical-imaging, radiology, image-classification, machine-learning-research, diagnostic-imaging
Badges: Stale (6m), No Package, No Dependents
Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 15 / 25


Stars: 19
Forks: 4
Language: Jupyter Notebook
License: MIT
Last pushed: May 29, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/SLAMPAI/large-scale-pretraining-transfer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
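For scripted access, the endpoint above can be called from Python. A minimal sketch follows; the URL pattern (`/api/v1/quality/<category>/<owner>/<repo>`) is inferred from the single example above, and the response format is not documented on this page, so both are assumptions.

```python
# Sketch: building and fetching the quality-record URL for a repository.
# The path pattern is inferred from the one example URL on this page (an
# assumption, not documented API behavior).
import urllib.parse

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repository's quality record."""
    parts = (category, owner, repo)
    return BASE + "/" + "/".join(urllib.parse.quote(p, safe="") for p in parts)

url = quality_url("ml-frameworks", "SLAMPAI", "large-scale-pretraining-transfer")
print(url)
# To fetch (no key needed, per the rate-limit note above):
#   import json, urllib.request
#   data = json.load(urllib.request.urlopen(url))
```

Quoting each path segment keeps the request well-formed even if an owner or repository name contains unusual characters.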