shaRk-033/ai.c

GPT written in plain C

Score: 44 / 100 (Emerging)

This project lets developers train a GPT-2 model from scratch using a C implementation. It takes raw text as input, tokenizes it with a Python script, and then compiles and runs a C program to perform the training. The output is a trained language model that can generate text from a sequence of tokens.


Use this if you are a developer interested in understanding the low-level mechanics of large language models and want to learn by building one in plain C.

Not ideal if you are looking for a high-level library to easily integrate a pre-trained GPT model or if you prefer working with established machine learning frameworks.

Tags: Machine Learning Engineering, Systems Programming, Neural Network Architecture, Deep Learning Implementation, Algorithm Optimization

No package · No dependents
Maintenance: 10 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 8 / 25


Stars: 132
Forks: 6
Language: C
License: MIT
Last pushed: Mar 11, 2026
Commits (30d): 0

Get this data via the API:

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/shaRk-033/ai.c"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
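The shape of the JSON the endpoint returns is not documented on this page, so the sketch below parses a hypothetical response whose field names (`repo`, `score`, `tier`, `breakdown`) are assumptions. It shows how the four sub-scores listed above would combine into the 44 / 100 total.

```python
import json

# Hypothetical response body -- the real field names returned by the
# pt-edge API are assumptions, not documented in this listing.
sample = json.loads("""
{
  "repo": "shaRk-033/ai.c",
  "score": 44,
  "tier": "Emerging",
  "breakdown": {
    "maintenance": 10,
    "adoption": 10,
    "maturity": 16,
    "community": 8
  }
}
""")

# The overall score is the sum of the four 25-point sub-scores.
total = sum(sample["breakdown"].values())
print(f"{sample['repo']}: {total}/100 ({sample['tier']})")
```

In practice you would replace the embedded string with the body of the `curl` request shown above (e.g. via `urllib.request` or `requests`).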