wangz10/contrastive_loss

Experiments with supervised contrastive learning methods with different loss functions

Quality score: 37 / 100 (Emerging)

If you're building machine learning models for classification and want to improve their ability to distinguish between different categories, this project helps you experiment with supervised contrastive learning. It takes your labeled dataset and applies various contrastive loss functions during training to produce a more robust classification model. This is for machine learning practitioners and researchers looking to enhance model performance beyond standard classification techniques.

224 stars. No commits in the last 6 months.

Use this if you are a machine learning practitioner looking to explore advanced loss functions to improve the distinctiveness of representations learned by your classification models.

Not ideal if you are new to machine learning or only need basic classification models; the project focuses on experimental, specialized training techniques.
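For orientation, here is a minimal NumPy sketch of the supervised contrastive (SupCon) loss family this project experiments with: embeddings of same-class examples are pulled together while all others are pushed apart. The function name and implementation details are illustrative, not taken from the repository (which uses its own loss variants).

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Illustrative supervised contrastive (SupCon) loss over a batch.

    features: (N, D) array of embeddings (L2-normalized inside)
    labels:   (N,) integer class labels
    """
    labels = np.asarray(labels)
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    n = features.shape[0]
    sim = features @ features.T / temperature        # pairwise cosine similarities
    sim = sim - sim.max(axis=1, keepdims=True)       # subtract row max for stability
    logits_mask = 1.0 - np.eye(n)                    # exclude self-comparisons
    exp_sim = np.exp(sim) * logits_mask
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    # positives share the anchor's label (self excluded)
    pos_mask = (labels[:, None] == labels[None, :]).astype(float) * logits_mask
    pos_count = pos_mask.sum(axis=1)
    valid = pos_count > 0                            # anchors with at least one positive
    mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1)[valid] / pos_count[valid]
    return float(-mean_log_prob_pos.mean())
```

With this loss, a batch whose same-class embeddings coincide yields a near-zero loss, while mismatched labels produce a much larger one, which is the property the repo's experiments compare across loss variants.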

Machine-Learning-Research Classification-Modeling Deep-Learning-Optimization Supervised-Learning
No License · Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 19 / 25


Stars: 224
Forks: 35
Language: Jupyter Notebook
License: none
Last pushed: Dec 08, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/wangz10/contrastive_loss"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
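The same endpoint can be called from Python. This is an illustrative wrapper: the URL builder matches the curl example above, but how an API key is sent (a Bearer header here) and the JSON response shape are assumptions, since the page documents only the keyless curl call and the daily limits.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(collection: str, owner: str, repo: str) -> str:
    """Build the quality-report URL for a repository (mirrors the curl example)."""
    return f"{API_BASE}/{collection}/{owner}/{repo}"

def fetch_quality(collection: str, owner: str, repo: str, api_key=None) -> dict:
    """Fetch the quality report as JSON. Passing the key as a Bearer token
    is an assumption; the keyless tier allows 100 requests/day."""
    req = urllib.request.Request(quality_url(collection, owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `fetch_quality("ml-frameworks", "wangz10", "contrastive_loss")` requests the same report shown on this page.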