AFAgarap/dl-relu
Deep Learning using Rectified Linear Units (ReLU)
This project helps machine learning researchers and practitioners understand the practical differences between activation functions such as ReLU in deep neural networks. Using image and text datasets, it demonstrates how several ReLU variants perform against traditional activation functions on classification and reconstruction tasks. The primary audience is anyone designing and optimizing neural network architectures.
No commits in the last 6 months.
Use this if you are a machine learning researcher or practitioner deciding which activation function to implement in a deep neural network for tasks like image or text classification.
Not ideal if you want a ready-to-use application or a deep learning framework; this repository is an empirical comparison of activation functions.
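To make the comparison concrete, here is a minimal NumPy sketch of the kinds of activations being compared. This is illustrative only and is not the repository's code, which trains full models rather than toy functions:

```python
import numpy as np

# Illustrative definitions only; the repository's experiments train
# complete networks, not these standalone functions.

def relu(x):
    """Standard rectifier: max(0, x)."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Leaky variant: small slope alpha for negative inputs."""
    return np.where(x > 0, x, alpha * x)

def sigmoid(x):
    """Traditional saturating activation, shown for contrast."""
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-3, 3, 7)
print("x         :", x)
print("relu      :", relu(x))
print("leaky_relu:", leaky_relu(x))
print("sigmoid   :", np.round(sigmoid(x), 3))
```

Note how the rectifiers pass positive inputs through unchanged while the sigmoid saturates at both extremes; that difference in gradient behavior is what the repository's experiments measure empirically.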
Stars: 23
Forks: 3
Language: Jupyter Notebook
License: Apache-2.0
Category: ml-frameworks
Last pushed: Aug 02, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AFAgarap/dl-relu"
Open to everyone: 100 requests/day with no API key. A free key raises the limit to 1,000 requests/day.
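A minimal Python equivalent of the curl command above, assuming only that the endpoint returns JSON (the response schema is not documented on this page):

```python
import requests  # third-party: pip install requests

# Same endpoint as the curl example above.
URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AFAgarap/dl-relu"

resp = requests.get(URL, timeout=10)
resp.raise_for_status()
# The response schema is not documented here, so just show the raw JSON.
print(resp.json())
```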
Higher-rated alternatives
digantamisra98/Mish
Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
Sentdex/nnfs_book
Sample code from the Neural Networks from Scratch book.
itdxer/neupy
NeuPy is a TensorFlow-based Python library for prototyping and building neural networks
vzhou842/cnn-from-scratch
A Convolutional Neural Network implemented from scratch (using only numpy) in Python.
nicklashansen/rnn_lstm_from_scratch
How to build RNNs and LSTMs from scratch with NumPy.