nlesc-dirac/pytorch
Improved LBFGS and LBFGS-B optimizers in PyTorch.
This provides advanced optimization algorithms (LBFGS and LBFGS-B) for training deep learning models more efficiently. It works with your existing PyTorch neural network model and can converge faster, or to a better solution, than standard first-order optimizers. It is aimed at machine learning researchers, data scientists, and engineers who train deep learning models for tasks such as image classification, scientific inverse problems, or clustering.
Use this if you are training deep neural networks in PyTorch and need more efficient convergence, especially for problems where traditional optimizers like Adam are too slow or get stuck.
Not ideal if you are working with extremely large batch sizes, or if you prefer simple optimizers that work well without per-problem tuning.
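Optimizers in the L-BFGS family approximate the inverse Hessian from a short history of recent steps and gradient changes, using the so-called two-loop recursion. Below is a minimal pure-Python sketch of that recursion on a toy quadratic; the objective, history size, and fixed step size are illustrative assumptions, and this is not this repository's implementation (a real L-BFGS uses a Wolfe line search instead of a fixed step):

```python
# Illustrative L-BFGS two-loop recursion on a toy quadratic.
# NOT the repository's implementation; all parameters are assumptions.

def f(x):
    # f(x) = (x0 - 1)^2 + (x1 + 2)^2, minimum at (1, -2)
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def grad(x):
    return [2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def lbfgs(x, m=5, iters=50, lr=0.5, tol=1e-10):
    s_hist, y_hist = [], []  # step and gradient-change pairs (history)
    g = grad(x)
    for _ in range(iters):
        # Two-loop recursion: q ≈ H^{-1} g from the last m pairs.
        q = list(g)
        alphas = []
        for s, y in reversed(list(zip(s_hist, y_hist))):  # newest -> oldest
            rho = 1.0 / dot(y, s)
            a = rho * dot(s, q)
            alphas.append((a, rho, s, y))
            q = [qi - a * yi for qi, yi in zip(q, y)]
        if y_hist:
            # Scale by an estimate of the curvature along the last step.
            gamma = dot(s_hist[-1], y_hist[-1]) / dot(y_hist[-1], y_hist[-1])
            q = [gamma * qi for qi in q]
        for a, rho, s, y in reversed(alphas):  # oldest -> newest
            b = rho * dot(y, q)
            q = [qi + (a - b) * si for qi, si in zip(q, s)]
        # Fixed step along the quasi-Newton direction (a real implementation
        # would use a Wolfe line search here).
        x_new = [xi - lr * qi for xi, qi in zip(x, q)]
        g_new = grad(x_new)
        s_hist.append([xn - xo for xn, xo in zip(x_new, x)])
        y_hist.append([gn - go for gn, go in zip(g_new, g)])
        if len(s_hist) > m:  # keep only the last m pairs
            s_hist.pop(0)
            y_hist.pop(0)
        x, g = x_new, g_new
        if dot(g, g) < tol:
            break
    return x

print(lbfgs([0.0, 0.0]))  # → [1.0, -2.0]
```

The limited history (m pairs) is what makes the method memory-efficient enough for large parameter vectors; LBFGS-B additionally handles simple box constraints on the variables.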
Stars: 67
Forks: 6
Language: Python
License: Apache-2.0
Category: ml-frameworks
Last pushed: Mar 19, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/nlesc-dirac/pytorch"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
nschaetti/EchoTorch: A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt: TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
gpauloski/kfac-pytorch: Distributed K-FAC preconditioner for PyTorch
opthub-org/pytorch-bsf: PyTorch implementation of Bezier simplex fitting
pytorch/xla: Enabling PyTorch on XLA Devices (e.g. Google TPU)