takara-ai/go-attention

A full attention mechanism and transformer implemented in pure Go.

Quality score: 38 / 100 (Emerging)

This project provides core building blocks for AI models, specifically attention mechanisms and transformer layers. It takes in sequences of data, like text or time series, and processes them to identify relationships and patterns within the data. The output is transformed data that can be used for tasks like prediction or summarization. It's designed for software developers building high-performance AI applications, especially those needing efficient deployments on cloud servers or edge devices.

451 stars. No commits in the last 6 months.

Use this if you are a software developer building AI applications in Go and need robust, high-performance attention mechanisms or transformer layers without external dependencies.

Not ideal if you are a data scientist or machine learning engineer looking for a high-level Python library for rapid prototyping or model training.

AI-application-development NLP-engineering Time-series-analysis Edge-AI High-performance-computing
Status: Stale (6 months) · No Package · No Dependents

Score breakdown:
Maintenance: 2 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 10 / 25


Stars: 451
Forks: 15
Language: Go
License: MIT
Last pushed: Jul 18, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/takara-ai/go-attention"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.