tsoernes/gfsopt

Convenient hyperparameter optimization

Score: 30 / 100 (Emerging)

This tool helps machine learning engineers and data scientists efficiently find the best settings (hyperparameters) for their models or algorithms. You input your machine learning model and a range of potential settings, and it systematically explores these options, even running multiple tests in parallel. The output is a set of optimized hyperparameters that make your model perform best, saving you time and computational resources.

No commits in the last 6 months. Available on PyPI.

Use this if you need to fine-tune the parameters of a machine learning model or any complex algorithm to achieve optimal performance, especially when dealing with stochastic outcomes that require averaging multiple runs.

Not ideal if your optimization problem is simple, involves only a few parameters, or does not require advanced global search techniques.
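The idea described above (searching a range of settings and averaging repeated runs of a stochastic objective) can be sketched with a plain random search. This is an illustration of the concept only, not gfsopt's actual API; the objective function and parameter ranges are hypothetical.

```python
import random
import statistics

def objective(lr, reg, seed):
    # Stand-in for a noisy validation score, peaked near lr=0.1,
    # reg=0.01 (hypothetical values chosen purely for illustration).
    rng = random.Random(seed)
    return -((lr - 0.1) ** 2 + (reg - 0.01) ** 2) + rng.gauss(0, 0.001)

def random_search(n_trials=200, n_repeats=5, seed=0):
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for trial in range(n_trials):
        params = {"lr": rng.uniform(0.001, 1.0),
                  "reg": rng.uniform(0.0, 0.1)}
        # Average several runs to smooth out stochastic outcomes,
        # as the description above suggests.
        score = statistics.mean(
            objective(params["lr"], params["reg"],
                      seed=trial * n_repeats + r)
            for r in range(n_repeats)
        )
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best, score = random_search()
print(best, score)
```

A tool like this one replaces the naive random sampling with a smarter global search and runs trials in parallel, but the input/output contract is the same: a parameter space in, a best-scoring parameter set out.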

machine-learning-optimization model-tuning algorithm-parameter-search data-science-workflows
Stale: 6 months
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 25 / 25
Community: 0 / 25


Stars: 14
Forks:
Language: Python
License: MIT
Last pushed: Apr 30, 2024
Commits (30d): 0
Dependencies: 3

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/tsoernes/gfsopt"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
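The same endpoint can be called from code. Only the URL is documented on this page; the response schema is not shown here, so this minimal Python sketch just constructs the documented URL and leaves the actual request (and JSON parsing) to the caller.

```python
import json
import urllib.request

# Base path taken from the curl example on this page.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Build the per-repository quality endpoint.
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "tsoernes", "gfsopt")
print(url)

# To actually fetch the data (network access required):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```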