Pervasive-AI-Lab/sparse-dynamic-synapses
Project code for the paper "The Unreasonable Effectiveness of Sparse Dynamic Synapses for Continual Learning".
This project explores how neural networks can continuously learn new information without forgetting old knowledge, much like the human brain. It takes existing neural network architectures and modifies them to use 'sparse dynamic synapses,' meaning only a few connections are active at a time and new connections can grow. The result is a neural network model that performs well on multiple tasks learned sequentially. This tool is for machine learning researchers and practitioners working on advanced AI models that need to adapt over time.
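To make the idea concrete, here is a minimal sketch of a sparse dynamic layer: a weight matrix gated by a binary mask, where most synapses start inactive and new ones can be activated over time. The class name, the growth rule, and all parameters are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical sketch of a "sparse dynamic" linear layer (names and the
# growth rule are illustrative assumptions, not the paper's method).
class SparseDynamicLinear:
    def __init__(self, in_dim, out_dim, density=0.1, seed=0):
        self.rng = np.random.default_rng(seed)
        self.weight = self.rng.standard_normal((in_dim, out_dim)) * 0.01
        # Binary mask: only a small fraction of synapses start active.
        self.mask = (self.rng.random((in_dim, out_dim)) < density).astype(float)

    def forward(self, x):
        # Inactive synapses contribute nothing to the output.
        return x @ (self.weight * self.mask)

    def grow(self, n_new):
        # Activate up to n_new currently inactive synapses, e.g. to add
        # capacity when a new task arrives.
        inactive = np.flatnonzero(self.mask == 0)
        if inactive.size == 0:
            return
        chosen = self.rng.choice(
            inactive, size=min(n_new, inactive.size), replace=False
        )
        self.mask.flat[chosen] = 1.0

layer = SparseDynamicLinear(8, 4, density=0.1)
before = int(layer.mask.sum())
layer.grow(5)
after = int(layer.mask.sum())
```

Keeping old synapses' masks frozen while only growing new ones is one plausible way such a layer could limit interference between sequential tasks.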
No commits in the last 6 months.
Use this if you are developing AI systems that need to learn new tasks incrementally while retaining previous capabilities, and you are exploring biologically inspired approaches to continual learning.
Not ideal if you are looking for a plug-and-play solution for general deep learning tasks or if your primary goal is to optimize for single-task performance.
Stars: 11
Forks: 1
Language: Python
License: —
Category:
Last pushed: Dec 05, 2019
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Pervasive-AI-Lab/sparse-dynamic-synapses"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
Higher-rated alternatives
aimagelab/mammoth
An Extendible (General) Continual Learning Framework based on Pytorch - official codebase of...
LAMDA-CL/PyCIL
PyCIL: A Python Toolbox for Class-Incremental Learning
GMvandeVen/continual-learning
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR,...
LAMDA-CL/LAMDA-PILOT
🎉 PILOT: A Pre-trained Model-Based Continual Learning Toolbox
mmasana/FACIL
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.