bes-dev/pytorch_clip_guided_loss
A simple library that implements CLIP guided loss in PyTorch.
This is a library for machine learning engineers and researchers working on generative models. Given a text prompt or a reference image, the loss scores how well a candidate output matches the target in CLIP's joint embedding space, so an optimizer can steer text-to-image, image-to-image, or image-to-text generation toward it.
No commits in the last 6 months.
Use this if you are a machine learning practitioner developing or experimenting with text-to-image, image-to-image, or image-to-text generative models and need a way to steer the generation process.
Not ideal if you are a non-technical end-user looking for a ready-to-use application for image generation, as this is a developer tool requiring coding knowledge.
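To make the guidance idea concrete, here is a minimal sketch of a CLIP-style guided loss in PyTorch. This is not this library's API: the encoders below are random linear stand-ins for CLIP's real image and text encoders (which map both modalities into a shared embedding space), and the dimensions are illustrative assumptions. The core mechanic is the same: one minus the cosine similarity between the two embeddings, backpropagated into the generated image.

```python
import torch
import torch.nn.functional as F

# Hypothetical stand-ins for CLIP's encoders: real CLIP maps images and
# tokenized text into a shared (e.g. 512-dim) space; these random linear
# projections only illustrate the shape of the computation.
torch.manual_seed(0)
image_encoder = torch.nn.Linear(3 * 32 * 32, 512)
text_encoder = torch.nn.Linear(77, 512)

def clip_guided_loss(image, text_tokens):
    """1 - cosine similarity between image and text embeddings.
    Minimizing it pulls the image toward the prompt in embedding space."""
    img_emb = F.normalize(image_encoder(image.flatten(1)), dim=-1)
    txt_emb = F.normalize(text_encoder(text_tokens), dim=-1)
    return (1.0 - (img_emb * txt_emb).sum(dim=-1)).mean()

# A generator's output (or the pixels themselves) can be optimized
# directly against the loss:
image = torch.randn(1, 3, 32, 32, requires_grad=True)
tokens = torch.randn(1, 77)  # placeholder for tokenized prompt features
loss = clip_guided_loss(image, tokens)
loss.backward()  # gradients flow back into the image pixels
```

In practice a guidance loop repeats this step: encode the current image, compute the loss against the frozen prompt embedding, and take a gradient step on the image or the generator's latent.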
Stars: 77
Forks: 3
Language: Python
License: Apache-2.0
Category:
Last pushed: Dec 25, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/bes-dev/pytorch_clip_guided_loss"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
mlfoundations/open_clip
An open source implementation of CLIP.
noxdafox/clipspy
Python CFFI bindings for the 'C' Language Integrated Production System CLIPS
openai/CLIP
CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image
moein-shariatnia/OpenAI-CLIP
Simple implementation of OpenAI CLIP model in PyTorch.
BioMedIA-MBZUAI/FetalCLIP
Official repository of FetalCLIP: A Visual-Language Foundation Model for Fetal Ultrasound Image Analysis