mstksg/backprop
Heterogeneous automatic differentiation ("backpropagation") in Haskell
This is a Haskell library for automatic differentiation ("backpropagation") aimed at numerical model training. You define your model's computation as ordinary Haskell functions, and the library derives functions that compute its gradient, even when intermediate values have different types such as matrices, vectors, or scalars. It's designed for developers building machine learning models or other numerical optimization systems in Haskell.
193 stars. No commits in the last 6 months.
Use this if you are a Haskell developer building deep learning models or other numerical optimization systems and need to automatically compute gradients for heterogeneous computations.
Not ideal if you are not a Haskell developer or if your computations involve only homogeneous data types, as simpler automatic differentiation libraries might suffice.
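As a minimal sketch of what "automatically computing gradients" looks like with this library: you write a function over `BVar` (backprop's lifted-variable type from `Numeric.Backprop`) and then run it forwards with `evalBP2` or differentiate it with `gradBP2`. The function `f` below is an illustrative example, not code from the repository:

```haskell
{-# LANGUAGE FlexibleContexts #-}

import Numeric.Backprop

-- f(x, y) = x^2 * y + y, written against BVar so the library can
-- track intermediate results and differentiate the whole pipeline.
f :: Reifies s W => BVar s Double -> BVar s Double -> BVar s Double
f x y = x ^ 2 * y + y

main :: IO ()
main = do
  print (evalBP2 f 3 2)  -- plain evaluation: 20.0
  print (gradBP2 f 3 2)  -- gradient at (3, 2): (12.0,10.0)
```

Because `BVar s Double` has the usual `Num`/`Fractional` instances, the body of `f` reads like ordinary numeric Haskell; heterogeneous computations (e.g. matrices and vectors flowing into a scalar loss) work the same way for any types with `Backprop` instances.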
Stars
193
Forks
22
Language
Haskell
License
BSD-3-Clause
Category
ml-frameworks
Last pushed
Jun 05, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/mstksg/backprop"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
JonathanRaiman/theano_lstm
🔬 Nano size Theano LSTM module
google/tangent
Source-to-Source Debuggable Derivatives in Pure Python
ahrefs/ocannl
OCANNL: OCaml Compiles Algorithms for Neural Networks Learning
yoshoku/numo-openblas
Numo::OpenBLAS builds and uses OpenBLAS as a background library for Numo::Linalg
statusfailed/catgrad
a categorical deep learning compiler