lixilinx/psgd_torch

PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioners, low-rank approximation preconditioner, and more)

Score: 42 / 100 (Emerging)

This project provides advanced optimization algorithms for machine learning models, helping researchers and practitioners train models more efficiently. It takes model parameters and gradients as input, and outputs improved parameter updates that accelerate convergence. Data scientists, machine learning engineers, and researchers working on deep learning or large-scale optimization problems would use this.


Use this if you are training complex machine learning models in PyTorch and need faster, more robust convergence, especially on large datasets or challenging loss landscapes.

Not ideal if you are a beginner looking for a simple optimizer, as this tool requires a deeper understanding of optimization theory.
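To make the idea concrete: preconditioned SGD rescales the raw gradient by a preconditioner before applying the update. The toy sketch below uses a fixed diagonal preconditioner on an ill-conditioned quadratic; it is purely illustrative and is not psgd_torch's algorithm (the library fits Kronecker-factored, affine, and low-rank preconditioners online from gradient statistics).

```python
# Core idea of preconditioned gradient descent:
#   theta <- theta - lr * (P @ grad)
# Here P is a hand-picked diagonal preconditioner for illustration only;
# psgd_torch learns its preconditioners during training instead.

def precond_gd_step(theta, grad, precond_diag, lr=0.1):
    """One step of diagonally preconditioned gradient descent."""
    return [t - lr * p * g for t, p, g in zip(theta, precond_diag, grad)]

# Loss 0.5 * (100*x**2 + y**2): curvature differs 100x between axes,
# so plain GD with lr=0.1 diverges along x while y crawls.
def grad_fn(theta):
    x, y = theta
    return [100.0 * x, y]

theta = [1.0, 1.0]
# An ideal preconditioner is roughly the inverse curvature per coordinate.
precond = [1.0 / 100.0, 1.0]
for _ in range(50):
    theta = precond_gd_step(theta, grad_fn(theta), precond)
# Both coordinates now shrink at the same geometric rate (0.9 per step).
```

With the preconditioner, both coordinates converge uniformly; without it, the same learning rate is unstable along the high-curvature axis.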

deep-learning-optimization machine-learning-research model-training large-scale-ml numerical-optimization
No License · No Package · No Dependents
Maintenance 13 / 25
Adoption 10 / 25
Maturity 8 / 25
Community 11 / 25

How are scores calculated?

Stars: 193
Forks: 13
Language: Python
License: none
Last pushed: Mar 22, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/lixilinx/psgd_torch"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
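The same endpoint can be called from Python. The URL pattern is taken from the curl line above; the helper name and the idea of a parameterized builder are illustrative, and the response schema is not documented here, so fetching is only sketched in comments.

```python
# Hypothetical helper around the public quality API shown above.
# Only the URL pattern comes from this page; field names in any
# response are assumptions, not a documented schema.
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(collection, owner, repo):
    """Build the quality-score endpoint URL for a repository."""
    return f"{API_BASE}/{quote(collection)}/{quote(owner)}/{quote(repo)}"

url = quality_url("ml-frameworks", "lixilinx", "psgd_torch")
# Fetching requires network access, e.g.:
#   import json, urllib.request
#   with urllib.request.urlopen(url) as r:
#       data = json.load(r)
```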