SENATOROVAI/L-BFGS-B-solver-course
L-BFGS-B (Limited-memory Broyden-Fletcher-Goldfarb-Shanno with Box constraints) is a numerical optimization method used to find the minimum of an objective function, for example the least-squares loss in linear regression. It is a quasi-Newton algorithm: rather than plain gradient descent, it uses a limited-memory approximation of the inverse Hessian matrix to choose its search directions.
This course and implementation demystify the L-BFGS and L-BFGS-B optimization algorithms, which are crucial for efficiently finding the minimum of complex mathematical functions, even with large datasets. It takes you from the core mathematical principles to a production-ready solver. Researchers, machine learning engineers, and students who need to solve large-scale optimization problems will find this valuable.
Use this if you are a researcher or ML engineer who needs to deeply understand and implement a robust, memory-efficient optimization method for large-scale, smooth problems, potentially with box constraints.
Not ideal if you simply need to run a pre-packaged optimization routine without understanding its inner workings or if your problems are non-smooth or require different optimization approaches.
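To make the use case concrete, here is a minimal sketch (not taken from the course itself) of fitting a linear regression with SciPy's L-BFGS-B solver, including the box constraints the course covers. The data, variable names, and bounds are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative example: fit y = X @ w by minimizing the least-squares
# objective with L-BFGS-B, constraining each weight to the box [0, 10].
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.5, 0.0, 3.0])
y = X @ w_true + 0.01 * rng.normal(size=100)

def loss(w):
    r = X @ w - y
    return 0.5 * r @ r            # scalar objective value

def grad(w):
    return X.T @ (X @ w - y)      # analytic gradient of the objective

res = minimize(loss, x0=np.zeros(3), jac=grad,
               method="L-BFGS-B", bounds=[(0, 10)] * 3)
print(res.x)  # estimated weights, close to w_true
```

Supplying the analytic gradient via `jac` avoids finite-difference approximations; `bounds` is what distinguishes L-BFGS-B from plain L-BFGS.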
Stars: 16
Forks: 14
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Mar 11, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/SENATOROVAI/L-BFGS-B-solver-course"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
jolars/sortedl1
Python package for Sorted L-One Penalized Estimation (SLOPE)
gugarosa/opytimizer
🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
SENATOROVAI/gradient-descent-sgd-solver-course
Stochastic Gradient Descent (SGD) is an optimization algorithm that updates model parameters...
hiroyuki-kasai/SGDLibrary
MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20
softmin/ReHLine-python
Regularized Composite ReLU-ReHU Loss Minimization with Linear Computation and Linear Convergence