ZigaSajovic/dCpp
Automatic differentiation in C++; infinite differentiability of conditionals, loops, recursion and all things C++
This is a C++ library for automatic differentiation. Replace the 'double' types in your existing C++ code with the library's 'var' type, and your functions become automatically differentiable with no manual symbolic or numerical differentiation. This is useful for C++ developers who need gradients for optimization, sensitivity analysis, or advanced mathematical modeling.
151 stars. No commits in the last 6 months.
Use this if you are a C++ programmer and need to efficiently calculate derivatives (including higher-order derivatives) of complex C++ functions, especially those involving conditionals, loops, and recursion, without writing manual symbolic or numerical differentiation code.
Not ideal if you are not a C++ programmer or if your primary need is for a general-purpose scientific computing library that happens to include automatic differentiation (e.g., Python-based machine learning frameworks).
Stars: 151
Forks: 27
Language: C++
License: —
Category: —
Last pushed: Apr 05, 2019
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ZigaSajovic/dCpp"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
- JonathanRaiman/theano_lstm: Nano size Theano LSTM module
- google/tangent: Source-to-Source Debuggable Derivatives in Pure Python
- ahrefs/ocannl: OCANNL: OCaml Compiles Algorithms for Neural Networks Learning
- yoshoku/numo-openblas: Numo::OpenBLAS builds and uses OpenBLAS as a background library for Numo::Linalg
- statusfailed/catgrad: a categorical deep learning compiler