100/Solid
🎯 A comprehensive gradient-free optimization framework written in Python
This framework helps developers solve optimization problems where traditional calculus-based methods aren't suitable or efficient. You define a problem and a custom objective function, and the framework returns the best solution found by one of several gradient-free algorithms. It's designed for software developers, data scientists, and researchers who implement optimization routines in their applications.
584 stars. No commits in the last 6 months.
Use this if you need to find optimal solutions for complex problems without relying on gradient calculations, and you are comfortable with Python programming to define your specific problem and objective function.
Not ideal if you're looking for a low-code or no-code tool, or if your optimization problem can be efficiently solved using gradient-based methods.
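To make "define your specific problem and objective function" concrete, here is a minimal random hill-climbing sketch in plain Python. This is a generic illustration of the gradient-free pattern, not Solid's actual API; the `hill_climb` helper and its parameters are hypothetical names for this example.

```python
import random

def hill_climb(objective, start, step=0.1, iters=5000, seed=0):
    """Minimize `objective` over a list of floats by random perturbation.

    Generic gradient-free sketch (hypothetical helper, not Solid's API):
    each iteration perturbs the current point and keeps the move only
    if it lowers the objective value.
    """
    rng = random.Random(seed)
    best = list(start)
    best_val = objective(best)
    for _ in range(iters):
        # Propose a nearby candidate point by jittering each coordinate.
        cand = [x + rng.uniform(-step, step) for x in best]
        val = objective(cand)
        if val < best_val:  # accept only improving moves
            best, best_val = cand, val
    return best, best_val

# Example objective: a shifted sphere with its minimum at (3, -2).
sphere = lambda v: (v[0] - 3) ** 2 + (v[1] + 2) ** 2

point, value = hill_climb(sphere, start=[0.0, 0.0])
```

No gradient of `sphere` is ever computed; the optimizer only evaluates the objective, which is what makes this family of methods usable on non-differentiable or black-box functions.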
Stars: 584
Forks: 59
Language: Python
License: MIT
Category:
Last pushed: Jul 19, 2019
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/100/Solid"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000/day.
Higher-rated alternatives
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)