keyhankamyar/SpaX
Pythonic, type-safe search space configuration for HPO (hyperparameter optimization), NAS (neural architecture search), and ML experiment tracking. Define complex search spaces with conditional parameters, automatic validation, and zero boilerplate. Built on Pydantic and designed to plug into Optuna for hyperparameter tuning.
This tool helps machine learning engineers and researchers manage complex model configurations and experiment parameters without tedious boilerplate code. You define your model's possible settings (such as optimizer choice or learning-rate ranges), and the tool validates them automatically, catches errors early, and helps explore different combinations efficiently. From your configuration structure it produces ready-to-use, validated parameter sets for model training and hyperparameter tuning.
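To make the idea of a conditional search space concrete, here is a minimal stdlib-only sketch of the pattern described above: an optimizer choice gates which sub-parameters exist, and configs are validated before use. This does not use SpaX's actual API (which is not shown here); the space layout, `sample`, and `validate` helpers are illustrative assumptions.

```python
import math
import random

# Hypothetical conditional search space (not SpaX's real syntax):
# momentum is only meaningful when the sampled optimizer is "sgd".
SPACE = {
    "optimizer": ["adam", "sgd"],
    "lr": (1e-5, 1e-1),       # sampled log-uniform
    "momentum": (0.0, 0.99),  # conditional on optimizer == "sgd"
}

def sample(rng: random.Random) -> dict:
    """Draw one configuration, respecting the conditional structure."""
    cfg = {"optimizer": rng.choice(SPACE["optimizer"])}
    lo, hi = SPACE["lr"]
    cfg["lr"] = math.exp(rng.uniform(math.log(lo), math.log(hi)))
    if cfg["optimizer"] == "sgd":
        cfg["momentum"] = rng.uniform(*SPACE["momentum"])
    return cfg

def validate(cfg: dict) -> None:
    """Catch ill-formed configs early, before any training starts."""
    assert 1e-5 <= cfg["lr"] <= 1e-1, "lr out of range"
    if cfg["optimizer"] == "adam":
        assert "momentum" not in cfg, "momentum is SGD-only"

cfg = sample(random.Random(0))
validate(cfg)
print(cfg)
```

Libraries like SpaX aim to replace this kind of hand-rolled sampling/validation glue with declarative, type-checked definitions.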
Available on PyPI.
Use this if you are an ML engineer or researcher who regularly tunes hyperparameters, performs neural architecture search, or needs to manage intricate, reproducible ML experiment configurations.
Not ideal if you only work with simple, fixed model configurations that don't require extensive searching or conditional parameter logic.
Stars
9
Forks
—
Language
Python
License
MIT
Category
ml-frameworks
Last pushed
Oct 29, 2025
Commits (30d)
0
Dependencies
1
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/keyhankamyar/SpaX"
Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000 requests/day.
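The same endpoint can be called from Python. The base URL and path come from the curl example above; the `quality_url` helper is a hypothetical convenience, and the JSON shape of the response is not documented here, so the fetch itself is left hedged.

```python
from urllib.parse import quote

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repo quality endpoint URL, escaping each path segment."""
    return f"{BASE}/{quote(category, safe='')}/{quote(owner, safe='')}/{quote(repo, safe='')}"

url = quality_url("ml-frameworks", "keyhankamyar", "SpaX")
print(url)
# To actually fetch (requires network; response schema not shown in this listing):
#   import json, urllib.request
#   data = json.load(urllib.request.urlopen(url))
```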
Higher-rated alternatives
optuna/optuna
A hyperparameter optimization framework
keras-team/keras-tuner
A Hyperparameter Tuning Library for Keras
KernelTuner/kernel_tuner
Kernel Tuner
syne-tune/syne-tune
Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
deephyper/deephyper
DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning