evanatyourservice/kron_torch
An implementation of PSGD Kron second-order optimizer for PyTorch
This tool helps machine learning engineers and researchers train PyTorch deep learning models more efficiently. PSGD Kron is a second-order method that preconditions gradients using Kronecker-factored curvature estimates; applied to your model's parameters and loss, it aims for faster convergence and better generalization than standard first-order optimizers. It is designed for practitioners who build and train neural networks.
No commits in the last 6 months. Available on PyPI.
Use this if you are training a PyTorch deep learning model and want to improve training speed, convergence, and the final model's generalization without extensive hyperparameter tuning.
Not ideal if you are not working with PyTorch models or are looking for a basic, first-order optimization algorithm.
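As a sketch of the drop-in-replacement workflow described above: the class name `Kron` and its constructor signature are assumptions based on the project description, not verified against the repository, so treat this as illustrative rather than canonical. The fallback import keeps the sketch runnable even where `kron_torch` is not installed.

```python
# Hedged sketch: swapping PSGD Kron into a standard PyTorch training loop.
# `Kron` and its `lr` keyword are assumed from the project description.
import torch
import torch.nn as nn

try:
    from kron_torch import Kron as Optimizer  # assumed entry point
except ImportError:
    # Fall back to plain SGD so the sketch runs without kron_torch installed.
    from torch.optim import SGD as Optimizer

model = nn.Linear(4, 1)
opt = Optimizer(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Toy regression data; in practice this is your dataloader batch.
x = torch.randn(16, 4)
y = torch.randn(16, 1)

for _ in range(5):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```

The point is that no other code changes: the optimizer is constructed from `model.parameters()` and driven by the usual `zero_grad`/`backward`/`step` cycle.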
Stars: 98
Forks: 6
Language: Python
License: CC-BY-4.0
Category:
Last pushed: Jul 24, 2025
Commits (30d): 0
Dependencies: 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/evanatyourservice/kron_torch"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
nschaetti/EchoTorch: A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt: TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
opthub-org/pytorch-bsf: PyTorch implementation of Bezier simplex fitting
gpauloski/kfac-pytorch: Distributed K-FAC preconditioner for PyTorch
pytorch/xla: Enabling PyTorch on XLA Devices (e.g. Google TPU)