Dicklesworthstone/bio_inspired_nanochat

A version of Karpathy's Nanochat that attempts to capture more biologically inspired structure.

Score: 45 / 100 (Emerging)

This project is for AI researchers and neuroscientists exploring how large language models can learn and adapt more like a biological brain. It takes standard language model inputs and generates text, but its internal connections change and evolve during use, mimicking real biological processes. This allows for dynamic memory, attention, and growth, moving beyond the static nature of traditional models.

Use this if you are a researcher focused on novel neural network architectures, biologically inspired AI, or the computational mechanisms behind brain function, and want to explore language models that can adapt and 'learn' in real-time during inference.

Not ideal if you are looking for an off-the-shelf, production-ready language model for general applications or a simple tool for immediate text generation tasks.

neuromorphic-computing computational-neuroscience language-model-research adaptive-AI cognitive-modeling
No package published · No dependents
Maintenance: 10 / 25
Adoption: 8 / 25
Maturity: 13 / 25
Community: 14 / 25


Stars: 45
Forks: 7
Language: Python
License:
Last pushed: Mar 12, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Dicklesworthstone/bio_inspired_nanochat"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
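As a minimal sketch, the same endpoint can be queried from Python using only the standard library. The response is assumed to be JSON (the listing does not document the response schema), and the free tier shown above requires no key, so no authentication is sent.

```python
import json
import urllib.request

# Base path of the quality API, taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, repo: str) -> str:
    """Build the quality endpoint URL for a repo within a category."""
    return f"{API_BASE}/{category}/{repo}"


def fetch_quality(category: str, repo: str, timeout: float = 10.0) -> dict:
    """Fetch the quality record for a repo. Requires network access;
    assumes the endpoint returns a JSON object."""
    with urllib.request.urlopen(quality_url(category, repo), timeout=timeout) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Prints the same URL used in the curl example above.
    print(quality_url("ml-frameworks", "Dicklesworthstone/bio_inspired_nanochat"))
```

`fetch_quality` is a convenience wrapper; for the free tier, no headers beyond the defaults should be needed.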