ARM-software/mango
Parallel Hyperparameter Tuning in Python
This tool helps machine learning engineers and data scientists efficiently find the best hyperparameter settings for their models, such as k-nearest neighbors, SVM, or XGBoost classifiers. Given a defined search space of possible parameter values, it returns the combination that yields the best model performance. The search is accelerated by running many trials in parallel, saving valuable time.
Use this if you need to systematically search many hyperparameter combinations to optimize model performance, especially with complex search spaces or when you want to leverage parallel computing.
Not ideal if you only need simple, manual parameter adjustment, or if your models do not require hyperparameter tuning.
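The parallel-trials idea described above can be sketched with nothing but the Python standard library. This is a conceptual illustration, not mango's actual API (mango drives the search with Bayesian optimization over a declared parameter space rather than random sampling, as its documentation describes); the objective function, parameter ranges, and helper names here are invented for the example.

```python
# Conceptual sketch of parallel hyperparameter tuning: evaluate many
# sampled configurations concurrently and keep the best one.
import random
from concurrent.futures import ThreadPoolExecutor

def accuracy(params):
    # Stand-in objective: a real version would train and score a model
    # (e.g. an SVM) with these hyperparameters. This toy surface peaks
    # at C=1.0, gamma=0.1.
    c, gamma = params["C"], params["gamma"]
    return -((c - 1.0) ** 2 + (gamma - 0.1) ** 2)

def sample():
    # Draw one random configuration from the search space.
    return {"C": random.uniform(0.01, 10), "gamma": random.uniform(0.001, 1)}

def tune(n_trials=64, workers=4):
    trials = [sample() for _ in range(n_trials)]
    # ThreadPoolExecutor is used here for portability; objectives that
    # release the GIL, or a ProcessPoolExecutor, give true parallelism.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        scores = list(pool.map(accuracy, trials))
    # Return the (score, params) pair with the highest score.
    return max(zip(scores, trials), key=lambda t: t[0])

if __name__ == "__main__":
    score, params = tune()
    print(score, params)
```

In mango itself you would instead declare the space once (e.g. with `scipy.stats` distributions), pass it with your objective to a `Tuner`, and call `maximize()`; the parallel-evaluation loop above is what the library manages for you.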
Stars
418
Forks
47
Language
Jupyter Notebook
License
Apache-2.0
Category
ML frameworks
Last pushed
Mar 11, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ARM-software/mango"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
optuna/optuna
A hyperparameter optimization framework
keras-team/keras-tuner
A Hyperparameter Tuning Library for Keras
KernelTuner/kernel_tuner
Kernel Tuner
syne-tune/syne-tune
Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
deephyper/deephyper
DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning