Luodian/Generalizable-Mixture-of-Experts

GMoE could be the next backbone model for many kinds of generalization tasks.

Quality score: 41 / 100 (Emerging)

This project helps machine learning researchers and practitioners develop models that perform well on new, unseen datasets without needing to retrain or fine-tune extensively. You provide existing image datasets, and it outputs a highly generalizable model that can make accurate predictions even when faced with significant shifts in data distribution. This is ideal for those building robust computer vision systems across diverse environments.
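The sparse mixture-of-experts idea behind GMoE can be sketched minimally: a gating network routes each input to its top-k experts and combines their outputs. This is a generic illustration of top-k MoE routing, not the repository's actual implementation; all names (`top_k_gating`, `moe_forward`) and the linear "experts" are illustrative assumptions.

```python
import numpy as np

def top_k_gating(x, w_gate, k=2):
    """Route each token to its top-k experts (generic sparse-MoE gating,
    not GMoE's exact code)."""
    logits = x @ w_gate                              # (tokens, n_experts)
    topk = np.argsort(logits, axis=-1)[:, -k:]       # indices of k largest logits
    sel = np.take_along_axis(logits, topk, axis=-1)  # their logit values
    # Softmax over only the selected experts' logits.
    sel = np.exp(sel - sel.max(axis=-1, keepdims=True))
    weights = sel / sel.sum(axis=-1, keepdims=True)
    return topk, weights

def moe_forward(x, w_gate, experts, k=2):
    """Combine the top-k experts' outputs, weighted by the gate."""
    topk, weights = top_k_gating(x, w_gate, k)
    out = np.zeros_like(x)
    for i in range(x.shape[0]):                      # per token
        for j in range(k):
            out[i] += weights[i, j] * experts[topk[i, j]](x[i])
    return out

rng = np.random.default_rng(0)
d, n_experts = 8, 4
w_gate = rng.normal(size=(d, n_experts))
# Each "expert" here is just a fixed linear map, for illustration only.
mats = [rng.normal(size=(d, d)) for _ in range(n_experts)]
experts = [lambda v, m=m: m @ v for m in mats]
x = rng.normal(size=(5, d))
y = moe_forward(x, w_gate, experts, k=2)
print(y.shape)  # (5, 8)
```

Because only k of the experts run per input, capacity grows with the number of experts while per-token compute stays roughly constant, which is what makes MoE attractive as a backbone.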

273 stars. No commits in the last 6 months.

Use this if you need to build computer vision models that reliably generalize across different data sources or operational environments without extensive retraining.

Not ideal if you are looking for a pre-trained model to use directly without further machine learning development.

Tags: domain adaptation, image classification, computer vision, research, generalization learning, machine learning models

Flags: Stale (6 months), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 15 / 25


Stars: 273
Forks: 28
Language: Python
License: MIT
Last pushed: Mar 21, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Luodian/Generalizable-Mixture-of-Experts"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
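The curl command above can also be issued from Python. Only the URL pattern shown in the curl example is known here; the response schema is not documented on this page, so this sketch returns the raw JSON as-is. The helper names (`quality_url`, `fetch_quality`) are illustrative.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Build the endpoint URL following the curl example above."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo):
    """GET the quality record; no key needed up to 100 requests/day.
    The response schema isn't documented here, so return the raw JSON."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

url = quality_url("ml-frameworks", "Luodian", "Generalizable-Mixture-of-Experts")
print(url)
```

Calling `fetch_quality("ml-frameworks", "Luodian", "Generalizable-Mixture-of-Experts")` would perform the same request as the curl command.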