wangz10/contrastive_loss
Experiments with supervised contrastive learning methods using different loss functions
If you're building machine learning models for classification and want to improve their ability to distinguish between different categories, this project helps you experiment with supervised contrastive learning. It takes your labeled dataset and applies various contrastive loss functions during training to produce a more robust classification model. This is for machine learning practitioners and researchers looking to enhance model performance beyond standard classification techniques.
224 stars. No commits in the last 6 months.
Use this if you are a machine learning practitioner looking to explore advanced loss functions to improve the distinctiveness of representations learned by your classification models.
Not ideal if you are new to machine learning or only need basic classification models, as this delves into more experimental and specialized training techniques.
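To make the core idea concrete, here is a minimal NumPy sketch of a supervised contrastive (SupCon-style) loss, one of the loss families this kind of project experiments with. This is an illustrative reimplementation, not code from the repository; the function name, temperature default, and the assumption that embeddings are already L2-normalized are ours.

```python
import numpy as np

def supcon_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss over a batch.

    embeddings: (N, D) array, assumed L2-normalized.
    labels:     (N,) integer class labels.
    Anchors with no positive in the batch are skipped.
    """
    n = embeddings.shape[0]
    sim = embeddings @ embeddings.T / temperature          # pairwise similarities
    sim = sim - sim.max(axis=1, keepdims=True)             # numerical stability
    logits_mask = ~np.eye(n, dtype=bool)                   # exclude self-pairs
    exp_sim = np.exp(sim) * logits_mask
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    # positives: same label as the anchor, excluding the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & logits_mask
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0
    mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1)[valid] / pos_counts[valid]
    return -mean_log_prob_pos.mean()
```

The loss is small when same-label embeddings cluster together and large when they are spread apart, which is exactly the representation-distinctiveness property described above.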
Stars
224
Forks
35
Language
Jupyter Notebook
License
—
Category
ml-frameworks
Last pushed
Dec 08, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/wangz10/contrastive_loss"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
AdaptiveMotorControlLab/CEBRA
Learnable latent embeddings for joint behavioral and neural analysis - Official implementation of CEBRA
theolepage/sslsv
Toolkit for training and evaluating Self-Supervised Learning (SSL) frameworks for Speaker...
PaddlePaddle/PASSL
PASSL includes image self-supervised learning algorithms such as SimCLR, MoCo v1/v2, BYOL, CLIP, PixPro, SimSiam, SwAV, BEiT, and MAE, as well as Vision...
YGZWQZD/LAMDA-SSL
30 Semi-Supervised Learning Algorithms
ModSSC/ModSSC
ModSSC: A Modular Framework for Semi Supervised Classification