AnkurMali/ContinualPTNCN

We introduce a local recurrent predictive coding model, termed the Parallel Temporal Neural Coding Network (P-TNCN). Unlike classical RNNs, our model is purely local and does not require computing gradients backward in time; it is therefore more computationally efficient than BPTT and can be used for online learning.
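To illustrate the idea of local, BPTT-free learning described above, here is a minimal sketch of a predictive-coding-style recurrent step. This is a simplified illustration, not the repository's actual algorithm: the single-layer structure, the names `W`, `V`, and `z`, and the Hebbian-style update rules are all assumptions made for clarity; the real P-TNCN uses a multi-layer architecture with its own update equations.

```python
import numpy as np

# Hypothetical single-layer sketch (illustrative only, not the P-TNCN itself).
rng = np.random.default_rng(0)
n_in, n_hid = 8, 16
W = rng.normal(0, 0.1, (n_hid, n_in))   # prediction weights (hidden -> input)
V = rng.normal(0, 0.1, (n_hid, n_hid))  # recurrent weights
lr = 0.01

def step(x, z, W, V):
    """One time step: update the state, predict the input, and adjust
    weights using only the instantaneous local error (no gradients
    propagated backward through time)."""
    z_new = np.tanh(V @ z)           # recurrent state update
    x_hat = W.T @ z_new              # top-down prediction of the input
    e = x - x_hat                    # local prediction error
    # Local updates: each weight change depends only on activity at this step.
    W = W + lr * np.outer(z_new, e)
    V = V + lr * np.outer((e @ W.T) * (1 - z_new**2), z)
    return z_new, W, V, float(np.mean(e**2))

z = np.zeros(n_hid)
x = rng.normal(size=n_in)
z, W, V, err = step(x, z, W, V)
```

Because each update uses only quantities available at the current step, the model can learn online from a stream of inputs, which is the property the repository highlights.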

35 / 100 (Emerging)

This project offers a specialized neural network model, the Parallel Temporal Neural Coding Network (P-TNCN), that processes sequences of information more efficiently than traditional recurrent neural networks. It takes sequential data as input and learns patterns to make predictions or classify future elements in the sequence. It's designed for researchers and machine learning engineers working on advanced neural network architectures, particularly those interested in brain-inspired AI and online learning.

No commits in the last 6 months.

Use this if you are a researcher or AI engineer looking for a computationally efficient, brain-inspired recurrent neural network that supports online learning and does not require backpropagation through time.

Not ideal if you are looking for a plug-and-play solution for standard sequence modeling tasks without diving into the specifics of novel neural network architectures.

brain-inspired AI, online learning, recurrent neural networks, machine learning research, computational neuroscience
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 16 / 25
Community 14 / 25

How are scores calculated?

Stars

12

Forks

3

Language

Python

License

MIT

Last pushed

Nov 29, 2023

Commits (30d)

0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AnkurMali/ContinualPTNCN"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.