takara-ai/go-attention
A full attention mechanism and transformer in pure Go.
This project provides core building blocks for AI models, specifically attention mechanisms and transformer layers. It consumes sequences of data, such as text or time series, and models the relationships and patterns between their elements; the transformed output can feed downstream tasks like prediction or summarization. It's designed for software developers building high-performance AI applications, especially those needing efficient deployments on cloud servers or edge devices.
451 stars. No commits in the last 6 months.
Use this if you are a software developer building AI applications in Go and need robust, high-performance attention mechanisms or transformer layers without external dependencies.
Not ideal if you are a data scientist or machine learning engineer looking for a high-level Python library for rapid prototyping or model training.
Stars
451
Forks
15
Language
Go
License
MIT
Category
Transformers
Last pushed
Jul 18, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/takara-ai/go-attention"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
microsoft/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
jadore801120/attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
bhavnicksm/vanilla-transformer-jax
JAX/Flax implementation of 'Attention Is All You Need' by Vaswani et al....
kyegomez/SparseAttention
PyTorch implementation of the sparse attention from the paper: "Generating Long Sequences with...
AbdelStark/attnres
Rust implementation of Attention Residuals from MoonshotAI/Kimi