patrickjohncyh/fashion-clip
FashionCLIP is a CLIP-like model fine-tuned for the fashion domain.
FashionCLIP helps fashion professionals, such as e-commerce merchandisers or stylists, understand and categorize clothing items more effectively. It embeds images of fashion products and their text descriptions into a shared vector space, making it possible to match similar styles, colors, and types of apparel. This lets users improve product search, recommendation, and categorization workflows.
497 stars. No commits in the last 6 months.
Use this if you need to accurately classify fashion items, retrieve similar products based on images or text descriptions, or enhance search capabilities within the fashion domain.
Not ideal if your primary use case involves visual data outside of clothing and accessories, or if you require an extremely lightweight solution for general image recognition.
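Both retrieval and zero-shot classification with a CLIP-style model reduce to ranking cosine similarities between an image embedding and candidate text embeddings in the shared space. Below is a minimal sketch of that ranking step using mock vectors; in practice the embeddings would come from FashionCLIP's image and text encoders, and the small 4-d vectors and example labels here are placeholders for illustration only.

```python
import numpy as np

def rank_by_similarity(image_emb, text_embs, labels):
    """Rank candidate text labels by cosine similarity to an image embedding."""
    # L2-normalize so a plain dot product equals cosine similarity.
    img = image_emb / np.linalg.norm(image_emb)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    sims = txt @ img
    order = np.argsort(-sims)  # highest similarity first
    return [(labels[i], float(sims[i])) for i in order]

# Mock 4-d embeddings standing in for the model's real (much larger) outputs.
image_emb = np.array([0.9, 0.1, 0.0, 0.1])
text_embs = np.array([
    [0.8, 0.2, 0.1, 0.0],   # "a red dress"
    [0.0, 0.9, 0.3, 0.1],   # "a leather boot"
    [0.1, 0.0, 0.9, 0.2],   # "a denim jacket"
])
labels = ["a red dress", "a leather boot", "a denim jacket"]

ranking = rank_by_similarity(image_emb, text_embs, labels)
print(ranking[0][0])  # the best-matching description for the image
```

The same ranking works in both directions: fix a text query and rank product images to power search, or fix an image and rank label prompts to classify it.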
Stars: 497
Forks: 52
Language: Python
License: MIT
Category:
Last pushed: Jan 30, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/patrickjohncyh/fashion-clip"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
ClipsAI/clipsai
Clips AI is an open-source Python library that automatically converts long videos into clips.
ai-forever/ru-clip
A CLIP implementation for the Russian language
Lednik7/CLIP-ONNX
A simple library that speeds up CLIP inference by up to 3x (measured on a K80 GPU)
suinleelab/CellCLIP
[NeurIPS 2025] CellCLIP – Learning Perturbation Effects in Cell Painting via Text-Guided...
cene555/ruCLIP-SB
RuCLIP-SB (Russian Contrastive Language–Image Pretraining SWIN-BERT) is a multimodal model for...