LAION-AI/scaling-laws-openclip

Reproducible scaling laws for contrastive language-image learning (https://arxiv.org/abs/2212.07143)

Quality score: 30 / 100 (Emerging)

This project helps machine learning researchers understand how factors such as dataset size and model architecture affect the performance of large language-image models like CLIP. It provides the code and pre-trained models needed to reproduce the paper's experiments, so researchers can supply training configurations and study the resulting performance and scaling behavior. It is aimed at scientists and engineers working at the intersection of computer vision and natural language processing.

188 stars. No commits in the last 6 months.

Use this if you are a machine learning researcher or engineer studying the scaling properties of large language-image models and need to reproduce or build upon existing research.

Not ideal if you are a practitioner looking for a ready-to-use, off-the-shelf application or a tool for general image classification without deep model analysis.

machine-learning-research computer-vision natural-language-processing model-scaling foundation-models
No License · Stale (6m) · No Package · No Dependents
Maintenance: 2 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 10 / 25


Stars: 188
Forks: 11
Language: Jupyter Notebook
License: None
Last pushed: Jun 21, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/LAION-AI/scaling-laws-openclip"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
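For scripting, the same endpoint shown in the curl command can be queried from Python. Below is a minimal sketch using only the standard library; the helper names (`quality_url`, `fetch_quality`) are illustrative, and the response is assumed to be a JSON object, since the page does not document the schema.

```python
import json
import urllib.request

# Base URL taken from the curl example above.
BASE_URL = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the endpoint URL for one repository (helper name is illustrative)."""
    return f"{BASE_URL}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """GET the endpoint and parse the body; assumes a JSON response."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (performs a live request, subject to the 100/day anonymous limit):
# data = fetch_quality("ml-frameworks", "LAION-AI", "scaling-laws-openclip")
```

With a free key (1,000 requests/day), authentication would presumably be passed as a header or query parameter; check the service's own documentation, as the page does not specify the mechanism.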