yashbonde/rasp

An implementation of RASP (Restricted Access Sequence Processing), the transformer programming language from "Thinking Like Transformers" (https://arxiv.org/pdf/2106.06981.pdf).

Quality score: 35 / 100 (Emerging)

This tool helps researchers and AI practitioners understand how Transformer neural networks process sequences. You provide a human-readable RASP program that defines a sequence-manipulation task, and it outlines the corresponding neural network architecture. It's for anyone exploring the computational capabilities of Transformers or looking to translate high-level sequence logic into network components.

Use this if you want to conceptually program a Transformer's behavior for sequence tasks like reversing strings, counting frequencies, or sorting, and then see the underlying neural network structure.
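To make the idea concrete, here is a minimal Python sketch of RASP's two core primitives, select (build an attention-like selector) and aggregate (gather values through it), used to reverse a token sequence. This is an illustrative re-implementation of the semantics described in the paper, not the repo's actual API; the function names mirror RASP's vocabulary but the code itself is an assumption.

```python
def select(keys, queries, predicate):
    """Build an attention-like boolean selector matrix: one row per
    query position, one column per key position."""
    return [[predicate(k, q) for k in keys] for q in queries]

def aggregate(selector, values):
    """For each query position, collect the values at the selected
    key positions (here each row selects exactly one position)."""
    out = []
    for row in selector:
        chosen = [v for v, selected in zip(values, row) if selected]
        out.append(chosen[0] if len(chosen) == 1 else chosen)
    return out

def reverse(tokens):
    """RASP-style reverse: position i attends to position n - 1 - i."""
    n = len(tokens)
    indices = list(range(n))
    flip = select(indices, indices, lambda k, q: k == n - 1 - q)
    return aggregate(flip, tokens)

print(reverse(list("hello")))  # ['o', 'l', 'l', 'e', 'h']
```

Each select/aggregate pair corresponds to one attention head in the compiled Transformer, which is how a program like this maps onto a network architecture.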

Not ideal if you're looking for a general-purpose machine learning library to train existing Transformer models on large datasets, or if you need to deploy production-ready models.

Tags: AI-research, natural-language-processing, sequence-modeling, neural-network-design, computational-linguistics
No package. No dependents.
Maintenance 6 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 5 / 25


Stars: 60
Forks: 2
Language: Python
License: MIT
Last pushed: Nov 01, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/yashbonde/rasp"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.