kyegomez/MHMoE
Community implementation of the paper "Multi-Head Mixture-of-Experts", in PyTorch
This is a machine learning library for building more efficient and capable models, particularly ones that process diverse types of information. Following the paper, each input token is split into several sub-tokens ("heads"), each sub-token is routed to a specialized expert network, and the expert outputs are merged back into a single, more context-aware token representation (see the sketch after this description). Machine learning engineers and AI researchers who work on complex model architectures would find this useful.
Use this if you are developing advanced models that need to process and integrate information from multiple data types or representation perspectives.
Not ideal if you are a practitioner looking for a ready-to-use AI application, as this is a component for building such applications.
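The core mechanism described in the paper is a split-route-merge pipeline. Below is a minimal, illustrative PyTorch sketch of that idea with top-1 routing; the class and parameter names (MHMoESketch, heads, num_experts) are hypothetical and do not reflect this repository's actual API.

import torch
import torch.nn as nn
import torch.nn.functional as F


class MHMoESketch(nn.Module):
    """Sketch of MH-MoE: split each token into `heads` sub-tokens, route each
    sub-token to one of `num_experts` FFN experts, then merge them back."""

    def __init__(self, dim: int, heads: int = 4, num_experts: int = 8):
        super().__init__()
        assert dim % heads == 0
        self.heads = heads
        self.head_dim = dim // heads
        self.in_proj = nn.Linear(dim, dim)   # multi-head (split) projection
        self.out_proj = nn.Linear(dim, dim)  # merge projection
        self.router = nn.Linear(self.head_dim, num_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(self.head_dim, 4 * self.head_dim),
                nn.GELU(),
                nn.Linear(4 * self.head_dim, self.head_dim),
            )
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape
        # Split tokens into sub-tokens: (batch * seq * heads, head_dim).
        sub = self.in_proj(x).reshape(b * t * self.heads, self.head_dim)
        # Top-1 routing: each sub-token goes to its highest-scoring expert.
        gate = F.softmax(self.router(sub), dim=-1)
        weight, idx = gate.max(dim=-1)
        out = torch.zeros_like(sub)
        for e, expert in enumerate(self.experts):
            mask = idx == e
            if mask.any():
                out[mask] = weight[mask, None] * expert(sub[mask])
        # Merge sub-tokens back into full-width tokens.
        return self.out_proj(out.reshape(b, t, d))


x = torch.randn(2, 16, 256)
layer = MHMoESketch(dim=256)
print(layer(x).shape)  # torch.Size([2, 16, 256])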
Stars
29
Forks
5
Language
Python
License
MIT
Category
Transformers
Last pushed
Jan 31, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/MHMoE"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
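For programmatic access, here is a minimal Python sketch of the same request using the requests library; it assumes the endpoint returns JSON, which the page does not confirm.

import requests

url = "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/MHMoE"
resp = requests.get(url, timeout=10)
resp.raise_for_status()
print(resp.json())  # print whatever payload the API returns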
Higher-rated alternatives
galilai-group/stable-pretraining
Reliable, minimal and scalable library for pretraining foundation and world models
CognitiveAISystems/MAPF-GPT
[AAAI-2025] This repository contains MAPF-GPT, a deep learning-based model for solving MAPF...
UKPLab/gpl
Powerful unsupervised domain adaptation method for dense retrieval. Requires only unlabeled...
larslorch/avici
Amortized Inference for Causal Structure Learning, NeurIPS 2022
svdrecbd/mhc-mlx
MLX + Metal implementation of mHC: Manifold-Constrained Hyper-Connections by DeepSeek-AI.