xadrianzetx/optuna-distributed

Distributed hyperparameter optimization made easy

Quality score: 35 / 100 (Emerging)

This tool helps machine learning engineers and data scientists efficiently find good settings (hyperparameters) for their models. You provide your model's training code and the ranges for its settings, and the tool searches those ranges in parallel, returning the best-performing combination it finds. It's designed for anyone who regularly tunes complex machine learning models.

No commits in the last 6 months. Available on PyPI.

Use this if you need to speed up hyperparameter optimization for your machine learning models by running many trials in parallel, either on your local machine or across a cluster.
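A minimal usage sketch, assuming the from_study wrapper and objective-function pattern shown in the project's README; the Dask scheduler address is a placeholder you would fill in for cluster runs:

import optuna
import optuna_distributed
from dask.distributed import Client

def objective(trial):
    # Toy objective: minimize (x - 2)^2 over the range suggested below.
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2

# Wrap a regular Optuna study; with no Dask client supplied, trials are
# expected to run asynchronously on local processes rather than a cluster.
study = optuna_distributed.from_study(optuna.create_study())
study.optimize(objective, n_trials=100)
print(study.best_params)

# To distribute across a cluster, pass a Dask client instead (address is
# a placeholder): optuna_distributed.from_study(..., client=Client("<scheduler-address>"))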

Not ideal if you require advanced Optuna features such as callbacks or specific integration modules, or if you need to run local asynchronous optimization on Windows; these are not yet fully supported.

machine-learning-engineering data-science model-tuning hyperparameter-optimization distributed-computing
Stale: no commits for 6 months

Score breakdown (the four categories sum to the overall 35 / 100):
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 25 / 25
Community: 3 / 25


Stars: 38
Forks: 1
Language: Python
License: MIT
Last pushed: Jun 11, 2024
Commits (30d): 0
Dependencies: 3

Get this data via the API:

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/xadrianzetx/optuna-distributed"

Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
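For programmatic access, a minimal Python sketch using only the standard library; the JSON response schema is not documented here, so the payload is printed as-is rather than assuming any field names:

import json
import urllib.request

URL = (
    "https://pt-edge.onrender.com/api/v1/quality/"
    "ml-frameworks/xadrianzetx/optuna-distributed"
)

# Fetch the quality report; the unauthenticated tier allows 100 requests/day.
with urllib.request.urlopen(URL) as response:
    report = json.load(response)

# Schema unspecified, so dump the whole payload for inspection.
print(json.dumps(report, indent=2))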