hasanisaeed/C-Transformer

Implementation of the core Transformer architecture in pure C

Score: 33 / 100 (Emerging)

This is a from-scratch implementation of the Transformer neural network architecture. It takes raw text input and processes it through the core stages of the model: token encoding, positional embedding, and the attention mechanism. The output shows the numerical representations produced at each step, culminating in a predicted word. It is primarily a tool for machine learning engineers, AI researchers, and students who are building or studying deep learning models.

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher who needs to understand or implement the core Transformer architecture at a low level in C.

Not ideal if you are looking for a pre-built, high-level library to apply Transformer models without needing to delve into the underlying C code.

deep-learning-engineering neural-network-development natural-language-processing AI-model-research low-level-ML-implementation
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 4 / 25
Maturity 16 / 25
Community 13 / 25


Stars: 8
Forks: 2
Language: C
License: MIT
Last pushed: Sep 08, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/hasanisaeed/C-Transformer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.