Helena-Yuhan-Liu/MDGL-main
Multidigraph learning (MDGL) for training recurrent spiking neural networks
This project gives computational neuroscientists and machine learning researchers a tool for training recurrent spiking neural networks that respect biological constraints such as Dale's law and connection sparsity. Given a network architecture and a task definition, it applies the multidigraph learning (MDGL) rule to optimize the network's performance while recording the learning signals for analysis. The output includes trained models and plots showing how the network learned each task.
No commits in the last 6 months.
Use this if you are a computational neuroscientist or researcher studying brain-inspired AI and need to train biologically plausible recurrent spiking neural networks using state-of-the-art learning rules, and you require explicit access to the learning signals for in-depth analysis.
Not ideal if you need a memory-optimized solution for large-scale, general-purpose deep learning applications, or if you prefer to use automatic differentiation without needing to explicitly compute intermediate learning signals.
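The biological constraints mentioned above can be made concrete with a short sketch. The code below is not from the MDGL repository; it is a minimal NumPy illustration, under assumed conventions (weights stored as W[post, pre], roughly 80% excitatory neurons, 20% connection density), of how Dale's law and a sparsity mask are typically imposed on a recurrent weight matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8  # number of recurrent units (illustrative size)

# Dale's law: each neuron is either excitatory (+1) or inhibitory (-1),
# so all of its outgoing weights share one sign.
signs = np.where(rng.random(n) < 0.8, 1.0, -1.0)  # ~80% excitatory

# Connection sparsity: a fixed binary mask zeroes most entries.
mask = (rng.random((n, n)) < 0.2).astype(float)
np.fill_diagonal(mask, 0.0)  # no self-connections

# Parameterize weights by non-negative magnitudes; the sign vector and the
# mask are applied on top, so updates to the magnitudes can never flip a
# neuron's sign or create a forbidden connection.
magnitudes = np.abs(rng.normal(scale=0.1, size=(n, n)))
W = mask * magnitudes * signs[None, :]  # column j carries neuron j's sign

# Sanity check: every nonzero column respects its neuron's sign.
for j in range(n):
    col = W[:, j][W[:, j] != 0]
    assert np.all(np.sign(col) == signs[j])
```

Parameterizing by non-negative magnitudes (rather than clipping signed weights after each update) is one common way to keep the constraint exact throughout training.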
Stars: 14
Forks: 2
Language: Python
License: —
Category: ml-frameworks
Last pushed: Dec 18, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Helena-Yuhan-Liu/MDGL-main"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
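The curl call above maps directly to any HTTP client. Here is a minimal Python sketch using only the standard library; the URL comes from this page, but the response schema is an assumption (we only assume the endpoint returns JSON):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the quality-data URL for a repo in a given category."""
    return f"{BASE}/{category}/{repo}"

url = quality_url("ml-frameworks", "Helena-Yuhan-Liu/MDGL-main")

if __name__ == "__main__":
    # Fetch and pretty-print the (assumed JSON) response.
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    print(json.dumps(data, indent=2))
```

The free tier (100 requests/day without a key) needs no authentication header, so a plain GET is sufficient.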
Higher-rated alternatives
kaanaksit/odak
Scientific computing library for optics, computer graphics and visual perception.
NVIDIA/torch-harmonics
Differentiable signal processing on the sphere for PyTorch
PreFab-Photonics/PreFab
Artificial nanofabrication of integrated photonic circuits using deep learning
MatthewFilipovich/torchoptics
Differentiable wave optics simulation library built on PyTorch
artificial-scientist-lab/XLuminA
A highly efficient, auto-differentiating discovery framework for super-resolution microscopy.