zimmerrol/keras-utility-layer-collection
Collection of custom layers and utility functions for Keras which are missing in the main framework.
This collection of Keras layers helps deep learning researchers and practitioners re-implement advanced neural network architectures, especially those involving attention mechanisms. It provides specialized layers such as Multi-Head Attention and Layer Normalization that can be dropped into existing Keras models to reproduce cutting-edge research. The primary user is a machine learning researcher or engineer working with deep learning models in Keras.
No commits in the last 6 months.
Use this if you are a deep learning researcher or practitioner using Keras and need to implement or replicate advanced attention mechanisms from state-of-the-art research papers.
Not ideal if you are not working with Keras, or if your deep learning tasks do not require specialized attention or normalization layers beyond what Keras offers out of the box.
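The attention layers the collection provides build on scaled dot-product attention. Here is a minimal NumPy sketch of that core operation, for readers deciding whether they need such layers; this is not the library's own API, and the function and variable names are illustrative only:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Core attention operation: softmax(Q K^T / sqrt(d_k)) V.

    q, k: arrays of shape (seq_len, d_k); v: shape (seq_len, d_v).
    Illustrative only -- the library wraps this idea as Keras layers
    with learned projections (and multiple heads).
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                # (seq_q, seq_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax
    return weights @ v                             # weighted sum of values

# Toy example: 3 query positions attending over 3 key/value positions
rng = np.random.default_rng(0)
q = rng.normal(size=(3, 4))
k = rng.normal(size=(3, 4))
v = rng.normal(size=(3, 2))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (3, 2)
```

The output has one row per query, each a convex combination of the value rows; a multi-head layer runs several such attentions in parallel over learned projections and concatenates the results.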
Stars: 62
Forks: 15
Language: Python
License: MIT
Category:
Last pushed: May 25, 2020
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/zimmerrol/keras-utility-layer-collection"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
localminimum/QANet
A Tensorflow implementation of QANet for machine reading comprehension
carpedm20/MemN2N-tensorflow
"End-To-End Memory Networks" in Tensorflow
HKUST-KnowComp/R-Net
Tensorflow Implementation of R-Net
domluna/memn2n
End-To-End Memory Network using Tensorflow
allenai/bi-att-flow
Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that...