Helena-Yuhan-Liu/MDGL-main

Multidigraph learning (MDGL) for training recurrent spiking neural networks

Score: 31 / 100 (Emerging)

This project provides a specialized tool for computational neuroscientists and machine learning researchers to train recurrent spiking neural networks that obey biological constraints such as Dale's law and connection sparsity. Given a network architecture and a task definition, it applies the MDGL learning rule, which uses cell-type-specific neuromodulatory signals to approximate gradient-based credit assignment, and records the intermediate learning signals for analysis. The output includes trained models and plots showing how the network learned each task.

No commits in the last 6 months.

Use this if you are a computational neuroscientist or researcher studying brain-inspired AI and need to train biologically plausible recurrent spiking neural networks using state-of-the-art learning rules, and you require explicit access to the learning signals for in-depth analysis.

Not ideal if you need a memory-optimized solution for large-scale, general-purpose deep learning applications, or if you prefer to use automatic differentiation without needing to explicitly compute intermediate learning signals.

computational-neuroscience spiking-neural-networks biologically-plausible-ai neuromodulation neural-network-training
Stale (6 months) · No package published · No dependents
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 10 / 25


Stars: 14
Forks: 2
Language: Python
License: not listed
Last pushed: Dec 18, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Helena-Yuhan-Liu/MDGL-main"
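The same endpoint can be queried from Python. The sketch below is a minimal example built from the curl command above; the helper names are hypothetical, and the shape of the JSON response is an assumption (the fields are not documented on this page).

```python
# Hypothetical helpers for the quality API shown above.
# The URL structure mirrors the curl example; the response is
# assumed to be JSON, but its fields are not documented here.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repository's quality report."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the quality report (requires network access)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Build the URL for this repository (no network access needed here).
url = quality_url("ml-frameworks", "Helena-Yuhan-Liu", "MDGL-main")
print(url)
```

Within the free tier, `fetch_quality` can be called without an API key, subject to the per-day request limits noted below.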

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.