raskr/rust-autograd
Tensors and differentiable operations (like TensorFlow) in Rust
This library is for machine learning engineers and researchers who build and train neural networks or other models where knowing how changes in inputs affect outputs is critical. You define your model mathematically, supply input data, and the library computes the partial derivatives of the outputs with respect to those inputs. This enables tasks such as optimizing model parameters or assessing feature importance.
500 stars. No commits in the last 6 months.
Use this if you are a Rust developer building machine learning models or algorithms that require automatic differentiation to calculate gradients for tasks like optimization or sensitivity analysis.
Not ideal if you are looking for a high-level, off-the-shelf machine learning framework with pre-built models and simple APIs for standard tasks.
Stars
500
Forks
38
Language
Rust
License
MIT
Category
ML Frameworks
Last pushed
Feb 11, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/raskr/rust-autograd"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
EnzymeAD/Enzyme
High-performance automatic differentiation of LLVM and MLIR.
Oxen-AI/Oxen
Lightning fast data version control system for structured and unstructured machine learning...
LaurentMazare/tch-rs
Rust bindings for the C++ api of PyTorch.
SunDoge/dlpark
A Rust Library for High-Performance Tensor Exchange with Python
TheMesocarp/koho
Full spectrum sheaf neural network over arbitrary CW complexes.