tsoernes/gfsopt
Convenient hyperparameter optimization
This tool helps machine learning engineers and data scientists efficiently find the best hyperparameters for their models or algorithms. You supply your model and a range of candidate settings, and it systematically explores the options, running multiple trials in parallel. The output is a set of optimized hyperparameters that maximizes your model's performance, saving time and compute.
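The workflow described above, supplying an objective and parameter ranges, then exploring them with parallel trials, can be sketched with a minimal gradient-free random search. This is an illustrative stand-in, not gfsopt's actual API; the `objective`, `bounds`, and `random_search` names are hypothetical:

```python
import random
from concurrent.futures import ThreadPoolExecutor

# Hypothetical objective: stands in for training and validating a real
# model at the given hyperparameters, returning a score to maximize.
def objective(params):
    lr, reg = params["lr"], params["reg"]
    return -(lr - 0.1) ** 2 - (reg - 0.01) ** 2

# Search ranges for each hyperparameter (illustrative values).
bounds = {"lr": (0.001, 1.0), "reg": (0.0001, 0.1)}

def sample(bounds, rng):
    # Draw one candidate uniformly from the ranges.
    return {k: rng.uniform(lo, hi) for k, (lo, hi) in bounds.items()}

def random_search(objective, bounds, n_trials=50, workers=4, seed=0):
    rng = random.Random(seed)
    candidates = [sample(bounds, rng) for _ in range(n_trials)]
    # Evaluate candidates in parallel, as the tool does with its trials.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        scores = list(pool.map(objective, candidates))
    best_score, best_idx = max(zip(scores, range(n_trials)))
    return candidates[best_idx], best_score
```

A real optimizer replaces the uniform sampling with a smarter global search strategy, but the contract is the same: hand it an objective and bounds, get back the best hyperparameters found.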
No commits in the last 6 months. Available on PyPI.
Use this if you need to fine-tune the parameters of a machine learning model or any complex algorithm for optimal performance, especially when outcomes are stochastic and must be averaged over multiple runs.
Not ideal if your optimization problem is simple, involves only a few parameters, or does not require advanced global search techniques.
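The stochastic case mentioned above, where the same hyperparameters yield different scores on each run, is typically handled by averaging several evaluations before comparing candidates. A minimal sketch, with a hypothetical `noisy_objective` standing in for a real training run:

```python
import random
import statistics

def noisy_objective(params, rng):
    # Hypothetical stochastic objective: identical hyperparameters give
    # different scores per run (e.g. random weight initialization).
    true_score = -(params["lr"] - 0.1) ** 2
    return true_score + rng.gauss(0, 0.01)

def averaged_objective(params, n_runs=10, seed=0):
    # Average over repeated runs so the optimizer compares stable
    # estimates rather than single noisy samples.
    rng = random.Random(seed)
    return statistics.mean(noisy_objective(params, rng) for _ in range(n_runs))
```

Averaging shrinks the noise on each estimate by roughly the square root of the number of runs, at the cost of proportionally more compute per candidate.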
Stars
14
Forks
—
Language
Python
License
MIT
Category
Last pushed
Apr 30, 2024
Commits (30d)
0
Dependencies
3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/tsoernes/gfsopt"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
SimonBlanke/Gradient-Free-Optimizers
Lightweight optimization with local, global, population-based and sequential techniques across...
Gurobi/gurobi-machinelearning
Formulate trained predictors in Gurobi models
emdgroup/baybe
Bayesian Optimization and Design of Experiments
heal-research/pyoperon
Python bindings and scikit-learn interface for the Operon library for symbolic regression.
simon-hirsch/ondil
A package for online distributional learning.