lixilinx/psgd_torch
PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioners, low-rank approximation preconditioner, and more)
This project provides advanced optimization algorithms for machine learning models, helping researchers and practitioners train models more efficiently. It takes model parameters and gradients as input, and outputs improved parameter updates that accelerate convergence. Data scientists, machine learning engineers, and researchers working on deep learning or large-scale optimization problems would use this.
Use this if you are training complex machine learning models in PyTorch and need a more robust and faster way to converge, especially with large datasets or challenging loss landscapes.
Not ideal if you are a beginner looking for a simple optimizer, as this tool requires a deeper understanding of optimization theory.
Stars: 193
Forks: 13
Language: Python
License: —
Category: —
Last pushed: Mar 22, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/lixilinx/psgd_torch"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)