google-research/hyperbo
Pre-trained Gaussian processes for Bayesian optimization
For machine learning engineers and researchers, HyperBO pre-trains Gaussian process priors and uses them for Bayesian optimization of deep learning hyperparameters. Given model architectures and training configurations, it searches for hyperparameter settings that improve model performance. Use it when you are developing or fine-tuning deep learning models and need to find good settings efficiently.
Use this if you are a machine learning practitioner looking to accelerate deep learning hyperparameter tuning.
Not ideal if you are a non-technical user or if your optimization problem does not involve deep learning hyperparameters.
Stars: 100
Forks: 8
Language: Python
License: Apache-2.0
Category:
Last pushed: Jan 09, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/google-research/hyperbo"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
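For scripting, the same endpoint can be called from Python. This is a minimal sketch using only the standard library; the URL pattern is taken from the curl example above, but the response schema is not documented here, so the code simply decodes and returns whatever JSON the API sends back.

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for one repository's quality data."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload for one repository.

    The response schema is an assumption (undocumented here); we
    just parse whatever JSON the endpoint returns.
    """
    with urllib.request.urlopen(
        quality_url(category, owner, repo), timeout=10
    ) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Same URL as the curl command above.
    print(quality_url("ml-frameworks", "google-research", "hyperbo"))
```

Swap `fetch_quality` for a `requests` call if that library is already a dependency; the URL construction is the only part specific to this API.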
Higher-rated alternatives
SimonBlanke/Gradient-Free-Optimizers
Lightweight optimization with local, global, population-based and sequential techniques across...
Gurobi/gurobi-machinelearning
Formulate trained predictors in Gurobi models
emdgroup/baybe
Bayesian Optimization and Design of Experiments
heal-research/pyoperon
Python bindings and scikit-learn interface for the Operon library for symbolic regression.
simon-hirsch/ondil
A package for online distributional learning.