Epistimio/orion
Asynchronous Distributed Hyperparameter Optimization.
This tool helps machine learning researchers and practitioners efficiently find the best hyperparameters for their models and training processes. You point it at your existing training script, and it explores candidate configurations to return optimized hyperparameters and, with them, improved model performance. It is aimed at anyone who regularly experiments with and tunes machine learning models.
301 stars. Available on PyPI.
Use this if you need to automatically and efficiently tune the settings of your machine learning models, especially across multiple experiments or computing resources.
Not ideal if you are looking for a general-purpose optimization library outside of the realm of machine learning model tuning.
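To make the workflow concrete, here is a minimal, self-contained sketch of what a hyperparameter search loop does: sample a learning rate log-uniformly, "train" with it, and keep the best result. All names here are illustrative, not Orion's API; Orion itself is driven by annotating your script's arguments with priors and launching it through its command-line interface (see the project's documentation).

```python
import math
import random

def train(lr: float) -> float:
    # Illustrative stand-in for a real training run: returns a toy
    # "validation loss" with its minimum near lr = 1e-2.
    return (math.log10(lr) + 2.0) ** 2

def random_search(n_trials: int, seed: int = 0) -> tuple[float, float]:
    """Sample learning rates log-uniformly in [1e-5, 1e-1], keep the best."""
    rng = random.Random(seed)
    best_lr, best_loss = 1e-5, float("inf")
    for _ in range(n_trials):
        lr = 10 ** rng.uniform(-5, -1)  # log-uniform draw
        loss = train(lr)
        if loss < best_loss:
            best_lr, best_loss = lr, loss
    return best_lr, best_loss

lr, loss = random_search(50)
print(f"best lr = {lr:.2e}, loss = {loss:.4f}")
```

A real optimizer improves on this loop by choosing configurations adaptively rather than at random, and by running many such trials asynchronously across workers, which is the distributed setting this tool targets.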
Stars
301
Forks
51
Language
Python
License
—
Category
—
Last pushed
Nov 26, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Epistimio/orion"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Compare
Related frameworks
optuna/optuna
A hyperparameter optimization framework
keras-team/keras-tuner
A Hyperparameter Tuning Library for Keras
KernelTuner/kernel_tuner
Kernel Tuner
syne-tune/syne-tune
Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
deephyper/deephyper
DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning