jaisidhsingh/pytorch-mixtures

One-stop solutions for Mixture of Expert modules in PyTorch.

Score: 46 / 100 (Emerging)

This is a tool for machine learning engineers and researchers who are building or experimenting with large neural networks. It simplifies the process of integrating Mixture of Experts (MoE) layers into custom PyTorch models. You provide your existing neural network architecture and a list of 'expert' sub-networks, and it outputs an enhanced network capable of more efficient and specialized processing.
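To illustrate the pattern the package wraps, here is a generic Mixture-of-Experts layer written in plain PyTorch. This is a sketch of the general technique, not the pytorch-mixtures API; the class name, constructor arguments, and routing details below are invented for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    """Token-level top-k routing over a list of expert MLPs (illustrative only)."""

    def __init__(self, dim: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        # Each expert is a small feed-forward sub-network.
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
             for _ in range(num_experts)]
        )
        self.router = nn.Linear(dim, num_experts)  # routing logits per token
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) -> flatten to individual tokens for routing
        batch, seq_len, dim = x.shape
        tokens = x.reshape(-1, dim)
        logits = self.router(tokens)                        # (num_tokens, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)

        out = torch.zeros_like(tokens)
        for expert_idx, expert in enumerate(self.experts):
            # (token, slot) positions routed to this expert
            mask = indices == expert_idx
            if mask.any():
                token_ids, slot_ids = mask.nonzero(as_tuple=True)
                expert_out = expert(tokens[token_ids])
                out[token_ids] += weights[token_ids, slot_ids].unsqueeze(-1) * expert_out
        return out.reshape(batch, seq_len, dim)

if __name__ == "__main__":
    moe = SimpleMoE(dim=64, num_experts=4, top_k=2)
    y = moe(torch.randn(2, 10, 64))
    print(y.shape)  # torch.Size([2, 10, 64])

In practice, a layer like this replaces the dense feed-forward block of a transformer so that each token only activates a few experts instead of the full network; the library's purpose is to let you drop such layers into an existing architecture without writing the routing logic yourself.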

Available on PyPI.

Use this if you are a machine learning engineer or researcher who wants to incorporate Mixture of Experts layers into custom PyTorch networks with minimal effort, in order to improve performance or efficiency.

Not ideal if you are unfamiliar with PyTorch or with implementing neural network architectures, as this is a developer-focused tool.

deep-learning neural-networks model-architecture pytorch-development machine-learning-engineering
No Dependents
Maintenance 10 / 25
Adoption 7 / 25
Maturity 25 / 25
Community 4 / 25


Stars: 27
Forks: 1
Language: Python
License: MIT
Last pushed: Feb 10, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/jaisidhsingh/pytorch-mixtures"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
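If you prefer to call the endpoint from Python instead of curl, a minimal sketch using the requests library is below. The URL is the one shown above; the assumption that the endpoint returns JSON is mine and is not stated on this page.

import requests

url = "https://pt-edge.onrender.com/api/v1/quality/transformers/jaisidhsingh/pytorch-mixtures"
response = requests.get(url, timeout=10)
response.raise_for_status()
data = response.json()  # assumed JSON payload containing the quality scores shown above
print(data)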