vikasverma1077/manifold_mixup
Code for reproducing Manifold Mixup results (ICML 2019)
This project helps machine learning researchers and practitioners train more robust and accurate deep learning models. During training, Manifold Mixup interpolates between the hidden states of different data examples, which yields more discriminative and compact representations and, in turn, models that perform better on a range of supervised learning tasks.
492 stars. No commits in the last 6 months.
Use this if you are a machine learning researcher or engineer looking to improve the generalization and robustness of your deep learning models, especially for image classification or other supervised learning tasks.
Not ideal if you are not working with deep learning models, or if your primary focus is semi-supervised learning; for the latter, the same authors recommend a different project.
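The core idea described above can be sketched in a few lines. The helper below is a hypothetical illustration (not the repo's actual API): it mixes a batch of hidden activations with a shuffled copy of itself using a coefficient drawn from a Beta(α, α) distribution, and mixes the one-hot targets with the same coefficient.

```python
import numpy as np

def mixup_hidden(h, y, alpha=2.0, rng=None):
    """Manifold-Mixup-style interpolation of a batch of hidden states.

    h : (batch, dim) hidden activations at some intermediate layer
    y : (batch, classes) one-hot (or soft) targets
    Returns the mixed activations, mixed targets, and the coefficient lam.
    Hypothetical helper for illustration; names are not from the repo.
    """
    rng = np.random.default_rng(rng)
    lam = rng.beta(alpha, alpha)            # mixing coefficient lam ~ Beta(alpha, alpha)
    perm = rng.permutation(len(h))          # pair each example with a random partner
    h_mix = lam * h + (1 - lam) * h[perm]   # interpolate hidden states
    y_mix = lam * y + (1 - lam) * y[perm]   # interpolate targets identically
    return h_mix, y_mix, lam
```

In the actual method, a random layer is chosen each minibatch, the forward pass runs to that layer, the activations and labels are mixed as above, and the forward pass continues on the mixed activations; the loss is computed against the mixed targets.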
Stars
492
Forks
62
Language
Python
License
—
Category
ml-frameworks
Last pushed
Mar 31, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/vikasverma1077/manifold_mixup"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
Westlake-AI/openmixup
CAIRI Supervised, Semi- and Self-Supervised Visual Representation Learning Toolbox and Benchmark
YU1ut/MixMatch-pytorch
Code for "MixMatch: A Holistic Approach to Semi-Supervised Learning"
kamata1729/QATM_pytorch
PyTorch implementation of QATM: Quality-Aware Template Matching for Deep Learning
nttcslab/msm-mae
Masked Spectrogram Modeling using Masked Autoencoders for Learning General-purpose Audio Representations
rgeirhos/generalisation-humans-DNNs
Data, code & materials from the paper "Generalisation in humans and deep neural networks" (NeurIPS 2018)