uber-research/differentiable-plasticity
Implementations of the algorithms described in "Differentiable plasticity: training plastic networks with gradient descent", a research paper from Uber AI Labs.
This project provides algorithms for training neural networks that keep adapting and learning new information after initial training. It augments existing network architectures with connections that update dynamically at run time. Machine learning researchers and AI developers can use it to explore advanced forms of artificial intelligence.
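The core idea behind those dynamically updating connections, as described in the paper, is that each connection combines a fixed weight with a Hebbian trace scaled by a learned plasticity coefficient. Below is a minimal NumPy sketch of that rule; the sizes and constants are illustrative, and the repo's actual PyTorch implementations differ in detail.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5                                        # illustrative number of units
w = rng.normal(scale=0.1, size=(n, n))       # fixed weights (trained by gradient descent)
alpha = rng.normal(scale=0.1, size=(n, n))   # plasticity coefficients (also trained)
eta = 0.1                                    # learning rate of the Hebbian trace

hebb = np.zeros((n, n))  # Hebbian trace: changes during the network's lifetime
x = np.zeros(n)          # unit activations

inp = rng.normal(size=n)
for _ in range(10):
    # Effective weight of each connection = fixed part + plastic part
    x_new = np.tanh((w + alpha * hebb).T @ x + inp)
    # Hebbian update: decaying running average of pre/post activity products
    hebb = (1 - eta) * hebb + eta * np.outer(x, x_new)
    x = x_new

print(x.shape)  # → (5,)
```

Because `alpha` and `w` are ordinary parameters, the whole loop is differentiable and both can be trained with standard backpropagation, while `hebb` keeps adapting after training ends.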
410 stars. No commits in the last 6 months.
Use this if you are a machine learning researcher or AI developer exploring advanced network architectures capable of continuous, unsupervised learning and adaptation.
Not ideal if you are looking for a pre-built, production-ready AI solution or a tool for standard supervised learning tasks.
Stars
410
Forks
71
Language
Python
License
—
Category
ML frameworks
Last pushed
Oct 23, 2019
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/uber-research/differentiable-plasticity"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
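For scripted access, the endpoint URL can be built from its parts. This is a small sketch based on the path scheme visible in the curl example above (category, then owner, then repo); treat that scheme as an assumption rather than documented behavior.

```python
# Base of the quality endpoint, taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given category and repo."""
    return f"{BASE}/{category}/{owner}/{repo}"


print(quality_url("ml-frameworks", "uber-research", "differentiable-plasticity"))
# → https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/uber-research/differentiable-plasticity
```

The result can then be fetched with any HTTP client (e.g. `urllib.request.urlopen`) and parsed as JSON.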
Higher-rated alternatives
sbi-dev/sbi
sbi is a Python package for simulation-based inference, designed to meet the needs of both...
SMTorg/smt
Surrogate Modeling Toolbox
reservoirpy/reservoirpy
A simple and flexible code for Reservoir Computing architectures like Echo State Networks
GPflow/GPflow
Gaussian processes in TensorFlow
dswah/pyGAM
[CONTRIBUTORS WELCOME] Generalized Additive Models in Python