mstksg/backprop

Heterogeneous automatic differentiation ("backpropagation") in Haskell

Score: 43 / 100 (Emerging)

This is a Haskell library for numerical model training, allowing you to automatically calculate gradients for complex functions. You define your model's computations, and it produces functions for computing gradients, even when intermediate steps involve different data types like matrices, vectors, or scalars. It's designed for developers building machine learning models or other numerical optimization systems in Haskell.
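Basic usage looks like the following — a minimal sketch based on the library's documented `Numeric.Backprop` API (`BVar`, `evalBP`, `gradBP`); check the package's Hackage docs for the exact signatures in your version:

```haskell
{-# LANGUAGE RankNTypes #-}

import Numeric.Backprop  -- from the backprop package

-- A function written once against BVar values can be both
-- evaluated and differentiated; the library builds the
-- backpropagated gradient automatically.
f :: Reifies s W => BVar s Double -> BVar s Double
f x = x ^ 2 + 3 * x

main :: IO ()
main = do
    print (evalBP f 5)   -- evaluate:  5^2 + 3*5 = 40
    print (gradBP f 5)   -- gradient:  2*5 + 3   = 13
```

The same pattern extends to the heterogeneous case: `BVar`s can wrap matrices, vectors, or records of parameters, and `gradBP` returns a gradient of the same shape as the input.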

193 stars. No commits in the last 6 months.

Use this if you are a Haskell developer building deep learning models or other numerical optimization systems and need to automatically compute gradients for heterogeneous computations.

Not ideal if you are not a Haskell developer or if your computations involve only homogeneous data types, as simpler automatic differentiation libraries might suffice.
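For that homogeneous case, one such simpler option is the `ad` package, where a gradient over a uniform container is a single call (a sketch assuming `ad`'s documented `Numeric.AD.grad`):

```haskell
import Numeric.AD (grad)  -- from the ad package

main :: IO ()
main = print (grad (\[x, y] -> x * y + sin x) [1, 2 :: Double])
-- gradient of x*y + sin x is [y + cos x, x]
```

Here every input and output shares one numeric type, so none of backprop's heterogeneous machinery is needed.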

Tags: deep-learning, numerical-optimization, machine-learning-engineering, gradient-descent, haskell-development
Flags: Stale (6m), No Package, No Dependents
Maintenance: 2 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 15 / 25


Stars: 193
Forks: 22
Language: Haskell
License: BSD-3-Clause
Last pushed: Jun 05, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/mstksg/backprop"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.