Pervasive-AI-Lab/sparse-dynamic-synapses

"The Unreasonable Effectiveness of Sparse Dynamic Synapses for Continual Learning" paper project.

Quality score: 28 / 100 (Experimental)

This project explores how neural networks can continuously learn new information without forgetting old knowledge, much like the human brain. It takes existing neural network architectures and modifies them to use 'sparse dynamic synapses,' meaning only a few connections are active at a time and new connections can grow. The output is a neural network model that can perform well on multiple tasks learned sequentially. This tool is for machine learning researchers and practitioners working on advanced AI models that need to adapt over time.
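To make the idea concrete, here is a minimal Python sketch of a layer with sparse dynamic synapses: a binary mask keeps only a few connections active, and inactive connections can be "grown" later. This is an illustrative toy using NumPy, not the project's actual implementation, and all names (`SparseDynamicLayer`, `grow`, `density`) are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

class SparseDynamicLayer:
    """Toy dense layer gated by a binary synapse mask.

    Only masked-in connections contribute to the output; `grow`
    activates new connections, mimicking synapse growth.
    Illustrative only; not the paper's implementation.
    """

    def __init__(self, n_in, n_out, density=0.1):
        self.w = rng.standard_normal((n_in, n_out)) * 0.1
        # Start with roughly `density` of the synapses active.
        self.mask = rng.random((n_in, n_out)) < density

    def forward(self, x):
        # Inactive synapses are zeroed out by the mask.
        return x @ (self.w * self.mask)

    def grow(self, n_new):
        """Activate up to n_new currently inactive synapses at random."""
        inactive = np.flatnonzero(~self.mask)
        chosen = rng.choice(inactive, size=min(n_new, inactive.size),
                            replace=False)
        self.mask.flat[chosen] = True

layer = SparseDynamicLayer(8, 4, density=0.1)
before = int(layer.mask.sum())
layer.grow(5)                       # grow five new connections
after = int(layer.mask.sum())
out = layer.forward(np.ones((1, 8)))
```

In a continual-learning setting, the appeal of such a scheme is that new tasks can claim newly grown connections while the mask protects weights used by earlier tasks.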

No commits in the last 6 months.

Use this if you are developing AI systems that need to learn new tasks incrementally while retaining previous capabilities, and you are exploring biologically inspired approaches to continual learning.

Not ideal if you are looking for a plug-and-play solution for general deep learning tasks or if your primary goal is to optimize for single-task performance.

continual-learning neural-networks sparse-models catastrophic-forgetting machine-learning-research
Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 7 / 25


Stars: 11
Forks: 1
Language: Python
License: (not listed)
Last pushed: Dec 05, 2019
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Pervasive-AI-Lab/sparse-dynamic-synapses"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
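The same endpoint can be consumed from Python. The sketch below fetches the report with the standard library and then, for offline illustration, parses a hypothetical payload shaped like the score breakdown above; the real response's field names may differ.

```python
import json
from urllib.request import urlopen

URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/Pervasive-AI-Lab/sparse-dynamic-synapses")

def fetch_report(url=URL):
    """Fetch the quality report as a dict (requires network access)."""
    with urlopen(url) as resp:
        return json.load(resp)

# Offline illustration: a hypothetical payload mirroring the
# sub-scores shown above (field names are assumptions).
sample = '{"maintenance": 0, "adoption": 5, "maturity": 16, "community": 7}'
scores = json.loads(sample)
total = sum(scores.values())  # the four sub-scores sum to the overall 28/100
```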