tensor-fusion/GPT-Haskell

A pure Haskell implementation of a decoder-only transformer (GPT)

Score: 18 / 100 · Experimental

This project helps researchers and students understand how Large Language Models (LLMs) work by providing a simplified implementation of the GPT-2 architecture. It loads GPT-2 model weights and tokenizer configuration, and lets you explore the internals of the decoder-only transformer and the text-generation loop. It is designed for people learning about, or teaching, deep learning and natural language processing.
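
For intuition about what such an implementation computes, below is a minimal, illustrative sketch of single-head scaled dot-product attention, the core operation of each decoder block. This is not code from the repository: it uses plain Haskell lists instead of an array library, and it omits the causal mask, multi-head split, and learned projections.

import Data.List (transpose)

type Vec = [Double]
type Mat = [Vec]  -- row-major: one row per token

dot :: Vec -> Vec -> Double
dot xs ys = sum (zipWith (*) xs ys)

matMul :: Mat -> Mat -> Mat
matMul a b = [[dot row col | col <- transpose b] | row <- a]

-- Numerically stable softmax over one row of scores.
softmax :: Vec -> Vec
softmax xs = map (/ s) es
  where es = map (\x -> exp (x - maximum xs)) xs
        s  = sum es

-- attention q k v: each query row attends over all key/value rows.
attention :: Mat -> Mat -> Mat -> Mat
attention q k v = matMul weights v
  where d       = fromIntegral (length (head k))
        scores  = [[dot qi ki / sqrt d | ki <- k] | qi <- q]
        weights = map softmax scores

-- attention [[1,0]] [[1,0],[0,1]] [[10,0],[0,10]]  ≈  [[6.7, 3.3]]
-- (the query matches the first key, so the output leans toward the first value)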

No commits in the last 6 months.

Use this if you are a computer science student, researcher, or educator who wants to study the core mechanics of a GPT-style model implemented in a functional programming language (Haskell).

Not ideal if you're looking for a production-ready tool to build or deploy large-scale language models, or if you need to perform advanced natural language processing tasks.

Tags: natural-language-processing · deep-learning · machine-learning-education · functional-programming · AI-research
No License · Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 8 / 25
Community 4 / 25


Stars: 21
Forks: 1
Language: Haskell
License: None
Last pushed: Jun 22, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/tensor-fusion/GPT-Haskell"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
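
For programmatic use, here is a minimal Haskell sketch that fetches the same endpoint. It assumes the http-conduit package (an assumption, not something the service mandates; any HTTP client works) and simply prints the raw JSON body, since the response schema isn't documented here.

import qualified Data.ByteString.Char8 as BS
import Network.HTTP.Simple (httpBS, getResponseBody, parseRequest)

main :: IO ()
main = do
  -- Same URL as the curl example above.
  req  <- parseRequest "https://pt-edge.onrender.com/api/v1/quality/llm-tools/tensor-fusion/GPT-Haskell"
  resp <- httpBS req
  BS.putStrLn (getResponseBody resp)  -- print the raw JSON response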