olivia-ai/the-math-behind-a-neural-network
A mathematics paper recapitulating the calculus behind a neural network and its backpropagation
This document helps you understand the detailed mathematical operations that power artificial neural networks, from how they process information to how they learn from errors. It explains the calculus behind concepts like forward propagation and backpropagation. Students, researchers, and AI practitioners looking to deepen their foundational knowledge of neural network mechanics would find this useful.
132 stars. No commits in the last 6 months.
Use this if you need a thorough, step-by-step explanation of the mathematical underpinnings of neural networks, including the specific calculus involved.
Not ideal if you're looking for a coding tutorial, a ready-to-use library, or a high-level overview without mathematical detail.
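To give a feel for the calculus the paper covers, here is a minimal sketch (not code from the paper, and NumPy-based rather than TeX) of one forward pass and one backpropagation step through a single-hidden-layer network with sigmoid activations and a squared-error loss. The network sizes and learning rate are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 1))    # input vector
y = np.array([[1.0]])          # target
W1 = rng.normal(size=(4, 3))   # hidden-layer weights
W2 = rng.normal(size=(1, 4))   # output-layer weights

# Forward propagation: z1 = W1 x, a1 = sigma(z1), z2 = W2 a1, a2 = sigma(z2)
z1 = W1 @ x
a1 = sigmoid(z1)
z2 = W2 @ a1
a2 = sigmoid(z2)
loss = 0.5 * np.sum((a2 - y) ** 2)   # squared-error loss

# Backpropagation via the chain rule, using sigma'(z) = sigma(z)(1 - sigma(z)):
# dL/dz2 = (a2 - y) * sigma'(z2)
delta2 = (a2 - y) * a2 * (1 - a2)
dW2 = delta2 @ a1.T                  # dL/dW2
delta1 = (W2.T @ delta2) * a1 * (1 - a1)
dW1 = delta1 @ x.T                   # dL/dW1

# One gradient-descent step
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2
```

Recomputing the forward pass after the update shows the loss decrease, which is the payoff the paper's derivation is building toward.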
Stars
132
Forks
9
Language
TeX
License
MIT
Category
ml-frameworks
Last pushed
May 21, 2020
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/olivia-ai/the-math-behind-a-neural-network"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
digantamisra98/Mish
Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
Sentdex/nnfs_book
Sample code from the Neural Networks from Scratch book.
itdxer/neupy
NeuPy is a TensorFlow-based Python library for prototyping and building neural networks
vzhou842/cnn-from-scratch
A Convolutional Neural Network implemented from scratch (using only numpy) in Python.
nicklashansen/rnn_lstm_from_scratch
How to build RNNs and LSTMs from scratch with NumPy.