Mixup Augmentation Frameworks: Transformer Models

There are 6 mixup augmentation framework projects tracked in the transformers domain. The highest-rated is kyegomez/LIMoE, scoring 48/100 with 36 stars.

Get all 6 projects as JSON

curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=transformers&subcategory=mixup-augmentation-frameworks&limit=20"

The API is open to everyone at 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
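Once fetched, the JSON can be filtered and ranked locally. A minimal sketch is below; the field names (`projects`, `model`, `score`, `tier`) and the sample payload are assumptions for illustration, since the actual response schema is not documented here.

```python
import json

# Hypothetical sample of the /datasets/quality response shape --
# the real schema returned by the API may differ.
sample_response = json.loads("""
{
  "projects": [
    {"model": "kyegomez/LIMoE", "score": 48, "tier": "Emerging"},
    {"model": "ibnaleem/mixtral.py", "score": 27, "tier": "Experimental"}
  ]
}
""")

def top_projects(payload, min_score=30):
    """Return project names scoring at or above min_score,
    sorted from highest to lowest score."""
    kept = [p for p in payload["projects"] if p["score"] >= min_score]
    return [p["model"] for p in sorted(kept, key=lambda p: -p["score"])]

print(top_projects(sample_response))  # -> ['kyegomez/LIMoE']
```

The same filtering could be done server-side if the endpoint supports a score parameter, but filtering locally avoids depending on undocumented query options.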

| # | Model | Description | Score | Tier |
|---|-------|-------------|-------|------|
| 1 | kyegomez/LIMoE | Implementation of the "the first large-scale multimodal mixture of experts... | 48 | Emerging |
| 2 | dohlee/chromoformer | The official code implementation for Chromoformer in PyTorch. (Lee et al.,... | 38 | Emerging |
| 3 | ahans30/goldfish-loss | [NeurIPS 2024] Goldfish Loss: Mitigating Memorization in Generative LLMs | 36 | Emerging |
| 4 | yinboc/trans-inr | Transformers as Meta-Learners for Implicit Neural Representations, in ECCV 2022 | 36 | Emerging |
| 5 | bloomberg/MixCE-acl2023 | Implementation of MixCE method described in ACL 2023 paper by Zhang et al. | 34 | Emerging |
| 6 | ibnaleem/mixtral.py | A Python module for running the Mixtral-8x7B language model with... | 27 | Experimental |