sherpa-ai/sherpa
Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
Sherpa helps machine learning researchers find the best settings for their models. You provide your model and a range of candidate values for its hyperparameters, and Sherpa automatically tests combinations to identify the ones that yield the best performance. It is aimed at data scientists and ML engineers who need to tune their models efficiently.
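The search loop that a tool like Sherpa automates can be sketched in plain Python. This is a minimal random-search illustration, not Sherpa's actual API; the toy objective, parameter names, and ranges are all invented for the example:

```python
import random

def train_and_score(lr, num_units):
    """Stand-in for training a model and returning a validation loss.
    (Toy quadratic with its minimum near lr=0.1, num_units=64.)"""
    return (lr - 0.1) ** 2 + ((num_units - 64) / 64) ** 2

# Candidate ranges, analogous to what you would hand to an optimizer.
search_space = {
    "lr": lambda: random.uniform(1e-4, 1.0),
    "num_units": lambda: random.choice([16, 32, 64, 128]),
}

random.seed(0)
best = None
for _ in range(50):  # 50 random trials
    params = {name: sample() for name, sample in search_space.items()}
    score = train_and_score(**params)
    if best is None or score < best[0]:
        best = (score, params)

print(best[1])  # best hyperparameter combination found
```

Real optimizers replace the random sampler with smarter strategies (Bayesian optimization, Hyperband, population-based training) and run trials in parallel, but the test-and-compare loop is the same.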
345 stars. No commits in the last 6 months.
Use this if you are building machine learning models and need an efficient way to automatically find the best hyperparameters (settings) without extensive manual experimentation.
Not ideal if you are not working with machine learning models or primarily need to optimize parameters for general software applications outside of ML contexts.
Stars: 345
Forks: 52
Language: JavaScript
License: GPL-3.0
Category: ML frameworks
Last pushed: Oct 18, 2020
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/sherpa-ai/sherpa"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
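The same endpoint can be queried from Python. The URL pattern below is taken from the curl example above; the JSON field names are not documented here, so the fetch is left as a commented sketch:

```python
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality endpoint URL shown above for any listed repo."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "sherpa-ai", "sherpa")
print(url)
# → https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/sherpa-ai/sherpa

# To fetch the JSON (100 requests/day without a key):
# import json, urllib.request
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)  # field names depend on the API's schema
```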
Higher-rated alternatives
optuna/optuna: A hyperparameter optimization framework
keras-team/keras-tuner: A hyperparameter tuning library for Keras
KernelTuner/kernel_tuner: Kernel Tuner
syne-tune/syne-tune: Large-scale and asynchronous hyperparameter and architecture optimization at your fingertips
deephyper/deephyper: DeepHyper, a Python package for massively parallel hyperparameter optimization in machine learning