hasanisaeed/C-Transformer
Implementation of the core Transformer architecture in pure C
This is a from-scratch implementation of the Transformer neural network architecture in pure C. It takes raw text input and passes it through the model's stages in turn: token encoding, positional embedding, and the attention mechanism. The program prints the numerical representations produced at each stage, culminating in a predicted next word. It is primarily a tool for machine learning engineers, AI researchers, and students who are building or studying deep learning models.
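For orientation, here is a minimal sketch in C of scaled dot-product attention, the core computation such an implementation carries out. The function names, fixed dimensions (SEQ_LEN, D_K), and the self-attention call in main are illustrative assumptions for this sketch, not taken from the repository's actual code.

/* Sketch of scaled dot-product attention: out = softmax(Q K^T / sqrt(d_k)) V.
 * Dimensions are hard-coded for clarity; a real implementation would size
 * these from the model configuration. Build with: gcc attention.c -lm */
#include <stdio.h>
#include <math.h>

#define SEQ_LEN 4   /* number of tokens in the sequence */
#define D_K     3   /* dimension of each query/key/value vector */

/* Numerically stable softmax over a row of n scores, in place. */
static void softmax(double *row, int n) {
    double max = row[0];
    for (int i = 1; i < n; i++)
        if (row[i] > max) max = row[i];
    double sum = 0.0;
    for (int i = 0; i < n; i++) {
        row[i] = exp(row[i] - max);
        sum += row[i];
    }
    for (int i = 0; i < n; i++)
        row[i] /= sum;
}

static void attention(const double Q[SEQ_LEN][D_K],
                      const double K[SEQ_LEN][D_K],
                      const double V[SEQ_LEN][D_K],
                      double out[SEQ_LEN][D_K]) {
    double scores[SEQ_LEN][SEQ_LEN];
    double scale = 1.0 / sqrt((double)D_K);

    /* Raw scores: dot product of each query with each key, scaled. */
    for (int i = 0; i < SEQ_LEN; i++)
        for (int j = 0; j < SEQ_LEN; j++) {
            double dot = 0.0;
            for (int k = 0; k < D_K; k++)
                dot += Q[i][k] * K[j][k];
            scores[i][j] = dot * scale;
        }

    /* Normalize each row of scores into attention weights. */
    for (int i = 0; i < SEQ_LEN; i++)
        softmax(scores[i], SEQ_LEN);

    /* Each output vector is a weighted sum of the value vectors. */
    for (int i = 0; i < SEQ_LEN; i++)
        for (int k = 0; k < D_K; k++) {
            double acc = 0.0;
            for (int j = 0; j < SEQ_LEN; j++)
                acc += scores[i][j] * V[j][k];
            out[i][k] = acc;
        }
}

int main(void) {
    /* Toy embeddings standing in for encoded tokens (illustrative values). */
    double X[SEQ_LEN][D_K] = {{1,0,0},{0,1,0},{0,0,1},{1,1,0}};
    double out[SEQ_LEN][D_K];
    attention(X, X, X, out);  /* self-attention: Q = K = V */
    for (int i = 0; i < SEQ_LEN; i++)
        printf("token %d -> [%.3f %.3f %.3f]\n",
               i, out[i][0], out[i][1], out[i][2]);
    return 0;
}

In a full Transformer this computation is repeated per attention head, and Q, K, and V are produced by learned linear projections of the input embeddings rather than used directly as above.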
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher who needs to understand or implement the core Transformer architecture at a low level in C.
Not ideal if you are looking for a pre-built, high-level library to apply Transformer models without needing to delve into the underlying C code.
Stars: 8
Forks: 2
Language: C
License: MIT
Category: transformers
Last pushed: Sep 08, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/hasanisaeed/C-Transformer"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action