Adlith/MoE-Jetpack
[NeurIPS 24] MoE Jetpack: From Dense Checkpoints to Adaptive Mixture of Experts for Vision Tasks
This project helps machine learning engineers and researchers train computer vision models faster and to higher accuracy. It converts an existing pre-trained dense vision model into a more efficient Mixture of Experts (MoE) model; the authors report faster convergence, higher accuracy, and better generalization across a range of vision tasks.
134 stars. No commits in the last 6 months.
Use this if you are building computer vision applications and want state-of-the-art accuracy with shorter training times and better model efficiency, especially for image classification, object detection, or segmentation.
Not ideal if you don't have a pre-trained dense vision model to start from, or if your primary focus is non-vision machine learning tasks.
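The dense-to-MoE conversion idea is easiest to see in code. Below is a minimal, illustrative PyTorch sketch, not the paper's actual implementation: DenseMLP, MoEBlock, recycle_dense_checkpoint, and the simple top-1 router are all assumptions chosen for clarity. It shows how one pre-trained dense MLP block could seed the experts of an MoE layer.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseMLP(nn.Module):
    # A ViT-style feed-forward block: Linear -> GELU -> Linear.
    def __init__(self, dim, hidden):
        super().__init__()
        self.fc1 = nn.Linear(dim, hidden)
        self.fc2 = nn.Linear(hidden, dim)

    def forward(self, x):
        return self.fc2(F.gelu(self.fc1(x)))

class MoEBlock(nn.Module):
    # Replaces one dense MLP with several experts plus a learned router.
    def __init__(self, dim, hidden, num_experts=4):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList(DenseMLP(dim, hidden) for _ in range(num_experts))

    def forward(self, x):                       # x: (num_tokens, dim)
        gates = self.router(x).softmax(dim=-1)  # (num_tokens, num_experts)
        weight, idx = gates.max(dim=-1)         # top-1 routing: one expert per token
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            sel = idx == e                      # tokens routed to expert e
            if sel.any():
                out[sel] = weight[sel].unsqueeze(-1) * expert(x[sel])
        return out

def recycle_dense_checkpoint(dense, moe):
    # Seed every expert with the pre-trained dense weights so fine-tuning
    # starts from the dense checkpoint rather than random initialization.
    for expert in moe.experts:
        expert.load_state_dict(dense.state_dict())

dense = DenseMLP(dim=192, hidden=768)           # stands in for one pre-trained block
moe = MoEBlock(dim=192, hidden=768)
recycle_dense_checkpoint(dense, moe)
tokens = torch.randn(16, 192)
print(moe(tokens).shape)                        # torch.Size([16, 192])

At inference each token only runs through its selected expert, which is why an MoE layer can add capacity without a proportional increase in per-token compute.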
Stars: 134
Forks: 1
Language: Python
License: Apache-2.0
Category: ml-frameworks
Last pushed: Nov 23, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Adlith/MoE-Jetpack"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
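For scripted access, the same endpoint can be queried from Python. A minimal sketch using the requests library follows; the response schema is not documented here, so the code simply prints the returned JSON.

import requests

url = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Adlith/MoE-Jetpack"
resp = requests.get(url, timeout=10)  # no API key needed up to 100 requests/day
resp.raise_for_status()               # fail loudly on HTTP errors or rate limits
print(resp.json())                    # inspect the returned quality metrics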
Higher-rated alternatives
AdaptiveMotorControlLab/CEBRA
Learnable latent embeddings for joint behavioral and neural analysis - Official implementation of CEBRA
theolepage/sslsv
Toolkit for training and evaluating Self-Supervised Learning (SSL) frameworks for Speaker...
PaddlePaddle/PASSL
PASSL includes image self-supervised learning algorithms such as SimCLR, MoCo v1/v2, BYOL, CLIP, PixPro, SimSiam, SwAV, BEiT, and MAE, as well as Vision...
YGZWQZD/LAMDA-SSL
30 Semi-Supervised Learning Algorithms
ModSSC/ModSSC
ModSSC: A Modular Framework for Semi-Supervised Classification