AbdelStark/attnres

Rust implementation of Attention Residuals from MoonshotAI/Kimi

Quality score: 42 / 100 (Emerging)

This project helps machine learning researchers and Rust engineers experiment with a new type of neural network layer called Attention Residuals. It provides the building blocks for Transformer models that learn to adjust how they combine information from different depths of the network. Given a model configuration and input data, it produces processed outputs or model weights for analysis.

Use this if you are a researcher validating a paper's findings, or a Rust engineer building new Transformer models who wants to explore the Attention Residuals concept.

Not ideal if you need a production-ready solution, require PyTorch ecosystem compatibility, or need validated performance on GPU deployments.
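To make the idea concrete, here is a minimal sketch of what a learnable residual-mixing layer might look like. This is an illustrative assumption, not the crate's actual API: the struct name `AttnResidual`, the field `alpha`, and the formula `y = attn_out + alpha * residual` are all hypothetical stand-ins for a layer that learns how strongly to carry the residual stream forward.

```rust
/// Hypothetical sketch of an "attention residual" connection. Instead of the
/// fixed skip connection `x + attn(x)`, the layer holds a learnable scalar
/// gate `alpha` that scales the residual stream. All names here are
/// assumptions for illustration, not the attnres crate's real interface.
struct AttnResidual {
    alpha: f32, // learnable mixing coefficient (updated during training)
}

impl AttnResidual {
    fn new(alpha: f32) -> Self {
        Self { alpha }
    }

    /// Combine the attention sublayer output with the residual stream:
    /// y[i] = attn_out[i] + alpha * residual[i]
    fn forward(&self, residual: &[f32], attn_out: &[f32]) -> Vec<f32> {
        residual
            .iter()
            .zip(attn_out)
            .map(|(r, a)| a + self.alpha * r)
            .collect()
    }
}

fn main() {
    let layer = AttnResidual::new(0.5);
    let residual = vec![2.0, 4.0];
    let attn_out = vec![1.0, 1.0];
    let y = layer.forward(&residual, &attn_out);
    // With alpha = 0.5: 1.0 + 0.5*2.0 = 2.0 and 1.0 + 0.5*4.0 = 3.0
    assert_eq!(y, vec![2.0, 3.0]);
    println!("{:?}", y);
}
```

With `alpha` fixed at 1.0 this reduces to the standard Transformer residual connection; letting the network learn `alpha` per layer is the kind of depth-dependent mixing the description above refers to.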

neural-networks transformer-models machine-learning-research large-language-models rust-ml-development
No package · No dependents
Maintenance 13 / 25
Adoption 10 / 25
Maturity 9 / 25
Community 10 / 25


Stars: 47
Forks: 5
Language: Rust
License: MIT
Last pushed: Mar 18, 2026
Monthly downloads: 9
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/AbdelStark/attnres"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.