MoonshotAI/MoBA

MoBA: Mixture of Block Attention for Long-Context LLMs

Quality score: 44 / 100 (Emerging)

This project helps AI engineers and researchers improve how large language models (LLMs) handle very long texts. It takes an existing LLM and, through continued training, swaps full attention for MoBA, a block-sparse mechanism in which each query attends only to the most relevant blocks of context. The result is a more efficient model that can process much longer inputs without a significant drop in quality. It is aimed at professionals building and deploying advanced LLMs who need to scale their models for extensive context understanding.
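A minimal PyTorch sketch may make the block-attention idea concrete. This is an illustration under stated assumptions, not the repository's implementation: the mean-key gating heuristic, the single-head shapes, and all names below are assumptions, and causal masking is omitted for brevity.

import torch
import torch.nn.functional as F

def moba_like_attention(q, k, v, block_size=64, top_k=2):
    # q, k, v: (seq_len, dim); seq_len must divide evenly into blocks.
    seq_len, dim = k.shape
    assert seq_len % block_size == 0, "pad inputs to a multiple of block_size"
    n_blocks = seq_len // block_size
    top_k = min(top_k, n_blocks)

    k_blocks = k.view(n_blocks, block_size, dim)
    v_blocks = v.view(n_blocks, block_size, dim)
    block_repr = k_blocks.mean(dim=1)          # one mean key per block

    # Gate: route each query to its top_k best-matching blocks.
    gate_scores = q @ block_repr.T             # (seq_len, n_blocks)
    top_blocks = gate_scores.topk(top_k, dim=-1).indices

    out = torch.empty_like(q)
    for i in range(seq_len):
        # Full attention, but only over the selected blocks' keys/values.
        ks = k_blocks[top_blocks[i]].reshape(-1, dim)
        vs = v_blocks[top_blocks[i]].reshape(-1, dim)
        attn = F.softmax(ks @ q[i] / dim ** 0.5, dim=-1)
        out[i] = attn @ vs
    return out

# Example: 256 tokens, single 64-dim head.
q = k = v = torch.randn(256, 64)
print(moba_like_attention(q, k, v).shape)      # torch.Size([256, 64])

Because each query scores only one representative per block and then attends within top_k blocks, the cost scales roughly with seq_len * top_k * block_size rather than seq_len squared, which is the efficiency gain the project targets.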

2,076 stars. No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher developing large language models and need to scale your models' ability to understand and process extremely long documents or conversations efficiently.

Not ideal if you are looking for a drop-in solution for an existing, pre-trained LLM: MoBA requires continued training to deliver its benefits.

Tags: large-language-models, natural-language-processing, ai-model-training, computational-efficiency, deep-learning-research
Status: Stale (6 months), No Package, No Dependents

Score breakdown:
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 18 / 25

Stars: 2,076
Forks: 136
Language: Python
License: MIT
Last pushed: Apr 03, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/MoonshotAI/MoBA"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
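If you would rather call the endpoint from Python than shell out to curl, here is a minimal sketch using the requests library. It assumes the endpoint returns JSON; the response schema is not documented here, so the code only prints the payload for inspection.

import requests

url = "https://pt-edge.onrender.com/api/v1/quality/transformers/MoonshotAI/MoBA"
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # surface HTTP errors (e.g. rate limiting) early
print(resp.json())       # inspect the payload before relying on specific fields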