brianberns/ModestGpt
A modest GPT written in F#
This project helps developers explore and understand the foundational mechanics of Generative Pre-trained Transformers (GPTs). It takes raw text data as input and trains a small, customizable language model capable of generating new, similar text. It's ideal for software engineers, data scientists, or AI researchers who want to learn how GPTs are built from the ground up on the .NET platform.
No commits in the last 6 months.
Use this if you are a .NET developer interested in deeply understanding the internal workings of a GPT model and experimenting with its components.
Not ideal if you need a production-ready, highly optimized, or large-scale language model for immediate application.
Stars
8
Forks
1
Language
F#
License
—
Category
—
Last pushed
Dec 03, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/brianberns/ModestGpt"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
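The curl command above can also be wrapped in a small script. The sketch below is a minimal Python example, assuming the endpoint returns JSON (the response format is not documented here); the helper names `quality_url` and `fetch_quality` are illustrative, not part of the API.

```python
# Hedged sketch: query the pt-edge quality API for a repository.
# Assumption: the endpoint returns a JSON document.
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the endpoint URL for a given owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the quality record for a repository."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

# Usage (performs a network request):
# data = fetch_quality("brianberns", "ModestGpt")
# print(data)
```

Keep in mind the anonymous tier is limited to 100 requests per day, so cache responses if you poll many repositories.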
Higher-rated alternatives
vixhal-baraiya/microgpt-c
The most atomic way to train and run inference on a GPT in pure, dependency-free C
milanm/AutoGrad-Engine
A complete GPT language model (training and inference) in ~600 lines of pure C#, zero dependencies
LeeSinLiang/microGPT
Implementation of GPT from scratch. Designed to be lightweight and easy to modify.
dubzdubz/microgpt-ts
A complete GPT built from scratch in TypeScript with zero dependencies
biegehydra/NanoGptDotnet
A miniature large language model (LLM) that generates Shakespeare-like text, written in C#…