MelbourneDeveloper/dart_llm

A tiny language model (LLM) written in pure Dart. It implements a minimal GPT-style network with a custom Tensor/autograd engine, trains on a small built‑in text sample, and prints a sampled continuation.

28 / 100 (Experimental)

This project helps Dart developers learn about and experiment with foundational elements of large language models. It takes a small text sample and basic configuration parameters as input to train a GPT-style network. The output is a generated text continuation, allowing developers to see a basic LLM in action and understand its inner workings.
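The repo's custom Tensor/autograd engine is the heart of "understanding the inner workings." Its Dart code is not reproduced here, but the core idea behind any such engine can be sketched in a few lines of Python (the `Value` class, its operators, and `backward` are illustrative names, not the repo's actual API):

```python
# Minimal scalar autograd sketch (micrograd-style).
# Illustrative only -- not dart_llm's actual Tensor implementation.
class Value:
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents          # nodes this value depends on
        self._local_grads = local_grads  # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, g in zip(v._parents, v._local_grads):
                p.grad += g * v.grad

x = Value(2.0)
y = Value(3.0)
z = x * y + x      # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

Every tensor operation in a real engine follows this same pattern — record parents and local derivatives on the forward pass, accumulate gradients in reverse topological order on the backward pass — just over arrays instead of scalars.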

No commits in the last 6 months.

Use this if you are a Dart developer curious about the core components of LLMs and want a hands-on, minimal example to understand how they train and generate text.
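The "generate text" half of that loop usually reduces to temperature-scaled softmax sampling over the model's output logits. A minimal Python sketch of that step (illustrative; the repo's Dart sampling code may differ):

```python
import math
import random

def sample(logits, temperature=1.0, rng=random):
    """Draw one token index from temperature-scaled softmax over logits."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF draw: walk the cumulative distribution until r falls inside it.
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1
```

Lower temperatures sharpen the distribution toward the argmax (more repetitive, "safer" text); higher temperatures flatten it (more varied, riskier text) — the same knob most LLM APIs expose.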

Not ideal if you need a production-ready LLM solution or a robust framework for complex natural language processing tasks.

Tags: dart-development, machine-learning-education, language-model-prototyping, artificial-intelligence-learning, deep-learning-fundamentals
No License · Stale 6m · No Package · No Dependents
Maintenance 2 / 25
Adoption 6 / 25
Maturity 7 / 25
Community 13 / 25


Stars: 17
Forks: 3
Language: Dart
License: none
Last pushed: Aug 18, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/MelbourneDeveloper/dart_llm"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.