LAION-AI/scaling-laws-openclip
Reproducible scaling laws for contrastive language-image learning (https://arxiv.org/abs/2212.07143)
This project helps machine learning researchers understand how factors such as dataset size, model architecture, and compute budget affect the performance of large language-image models like CLIP. It provides the code and pre-trained models needed to reproduce the paper's experiments, so researchers can supply training configurations and study the resulting performance and scaling behavior. It is aimed at machine learning scientists and researchers working at the intersection of computer vision and natural language processing.
188 stars. No commits in the last 6 months.
Use this if you are a machine learning researcher or engineer studying the scaling properties of large language-image models and need to reproduce or build upon existing research.
Not ideal if you are a practitioner looking for a ready-to-use, off-the-shelf application or a tool for general image classification without deep model analysis.
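The core technique behind the paper is fitting power laws of the form E = a * N^b to observed performance as scale (data, model size, or compute) grows. As a rough illustration only, here is a minimal, self-contained sketch of such a fit via linear least squares in log-log space; the data points and function name are hypothetical and are not taken from the repository's actual code.

```python
import math

def fit_power_law(xs, ys):
    """Fit y = a * x^b by least squares in log-log space.

    Scaling-law studies typically model error or loss as a power law of
    samples seen, parameters, or compute. This is an illustrative sketch,
    not the repository's fitting code.
    """
    logx = [math.log(x) for x in xs]
    logy = [math.log(y) for y in ys]
    n = len(xs)
    mx = sum(logx) / n
    my = sum(logy) / n
    # Slope of the log-log regression line is the power-law exponent b.
    b = sum((lx - mx) * (ly - my) for lx, ly in zip(logx, logy)) \
        / sum((lx - mx) ** 2 for lx in logx)
    # Intercept recovers the prefactor a.
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical (samples seen, zero-shot error) points roughly following N^-0.1:
a, b = fit_power_law([1e8, 1e9, 1e10], [0.50, 0.397, 0.315])
```

The fitted exponent `b` summarizes how quickly error falls with scale, which is the quantity the paper's scaling curves characterize.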
Stars: 188
Forks: 11
Language: Jupyter Notebook
License: —
Category: ml-frameworks
Last pushed: Jun 21, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/LAION-AI/scaling-laws-openclip"
Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000/day.
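The curl command above can also be issued from Python with the standard library. This is a minimal sketch assuming the path layout shown in the curl example; the response schema is not documented here, so the JSON is returned as-is, and the `Authorization` header name for keyed access is an assumption.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def build_url(category, owner, repo):
    """Build the endpoint URL, mirroring the curl example's path layout."""
    return f"{BASE}/{category}/{owner}/{repo}"

def repo_quality(category, owner, repo, api_key=None):
    """Fetch the quality record for a repository and return the parsed JSON."""
    req = urllib.request.Request(build_url(category, owner, repo))
    if api_key:
        # Assumed header name -- check the API docs for the real scheme.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

# Example: repo_quality("ml-frameworks", "LAION-AI", "scaling-laws-openclip")
```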
Higher-rated alternatives
mlfoundations/open_clip
An open source implementation of CLIP.
noxdafox/clipspy
Python CFFI bindings for the 'C' Language Integrated Production System CLIPS
openai/CLIP
CLIP (Contrastive Language-Image Pretraining): predicts the most relevant text snippet for a given image.
moein-shariatnia/OpenAI-CLIP
Simple implementation of OpenAI CLIP model in PyTorch.
BioMedIA-MBZUAI/FetalCLIP
Official repository of FetalCLIP: A Visual-Language Foundation Model for Fetal Ultrasound Image Analysis