open_clip vs. clip_playground

open_clip (A) is a widely used open-source implementation of CLIP that serves as a foundational library; clip_playground (B) is a small collection of demonstration notebooks that builds on such a library to showcase CLIP's zero-shot capabilities. The two projects are complements, not competitors.

open_clip — score 73 (Verified)
  Maintenance 13/25 · Adoption 15/25 · Maturity 25/25 · Community 20/25
  Stars: 13,496 · Forks: 1,253 · Downloads: · Commits (30d): 1
  Language: Python · License:
  Risk flags: none

clip_playground — score 38 (Emerging)
  Maintenance 0/25 · Adoption 10/25 · Maturity 16/25 · Community 12/25
  Stars: 178 · Forks: 13 · Downloads: · Commits (30d): 0
  Language: Jupyter Notebook · License: MIT
  Risk flags: Stale 6m, No Package, No Dependents

About open_clip

mlfoundations/open_clip

An open source implementation of CLIP.

This project provides pre-trained models that understand both images and text, allowing you to connect what you see with descriptive phrases. You can input an image and a list of text descriptions to get back probabilities of which description best matches the image. This is ideal for researchers or developers building applications that need to categorize images based on natural language or search for images using text.

image-text-matching zero-shot-classification multimodal-search computer-vision natural-language-processing
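The matching step described above (an image plus a list of candidate captions, returning a probability per caption) reduces to cosine similarity between embeddings followed by a softmax. The sketch below illustrates just that scoring math with made-up toy vectors; it is not the open_clip API, and in the real library the image and text embeddings would come from the model's encoders rather than being hand-written.

```python
import math

def zero_shot_probs(image_emb, text_embs):
    """Score one image embedding against candidate text embeddings.

    L2-normalize all vectors, take cosine similarities, scale them,
    and softmax into a probability per caption -- the scoring scheme
    CLIP-style models use for zero-shot classification.
    """
    def normalize(v):
        norm = math.sqrt(sum(x * x for x in v))
        return [x / norm for x in v]

    img = normalize(image_emb)
    sims = [sum(a * b for a, b in zip(img, normalize(t))) for t in text_embs]
    # CLIP scales similarities by a learned logit scale (roughly 100).
    logits = [100.0 * s for s in sims]
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy 3-d embeddings, purely illustrative (real CLIP embeddings are
# 512-d or larger and come from the image/text encoders).
image = [0.9, 0.1, 0.0]
captions = [[1.0, 0.0, 0.0],   # e.g. "a photo of a dog"
            [0.0, 1.0, 0.0]]   # e.g. "a photo of a cat"
probs = zero_shot_probs(image, captions)
```

Because the image vector points almost entirely along the first caption's direction, nearly all of the probability mass lands on the first caption after the softmax.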

About clip_playground

kevinzakka/clip_playground

An ever-growing playground of notebooks showcasing CLIP's impressive zero-shot capabilities

This project provides interactive examples for exploring how AI can understand images and text together, even for concepts it hasn't been explicitly trained on. You input images and text descriptions, and it shows you how well the AI can recognize objects or ideas within those images based on your descriptions. Researchers, data scientists, or AI enthusiasts can use this to quickly test and visualize cutting-edge computer vision techniques.

computer-vision-research zero-shot-learning image-understanding model-explanation AI-experimentation

Scores updated daily from GitHub, PyPI, and npm data.