keyhankamyar/SpaX

Pythonic, type-safe search space configuration for HPO (hyperparameter optimization), NAS (neural architecture search), and ML experiment tracking. Define complex search spaces with conditional parameters, automatic validation, and zero boilerplate. Pydantic-based and Optuna-ready for hyperparameter tuning.

Score: 33 / 100 (Emerging)

This tool helps machine learning engineers and researchers manage complex model configurations and experiment parameters without tedious boilerplate code. You define your model's possible settings—like optimizer choice or learning rates—and the tool automatically validates them, catches errors early, and helps explore different combinations efficiently. It takes your defined configuration structure and outputs ready-to-use, validated parameter sets for model training and hyperparameter tuning.
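To make the idea concrete, here is a minimal standard-library sketch of a conditional, self-validating search space. This is an illustration of the concept only, not SpaX's actual API: the class and field names are hypothetical, and SpaX itself builds on Pydantic rather than dataclasses.

```python
import random
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrialConfig:
    # Hypothetical config: 'momentum' is conditional on the optimizer choice.
    optimizer: str
    learning_rate: float
    momentum: Optional[float] = None

    def __post_init__(self):
        # Validate early, before any training run is launched.
        if self.optimizer not in ("sgd", "adam"):
            raise ValueError(f"unknown optimizer: {self.optimizer}")
        if not (1e-6 <= self.learning_rate <= 1.0):
            raise ValueError("learning_rate out of range")
        if self.optimizer == "adam" and self.momentum is not None:
            raise ValueError("momentum is only valid with sgd")

def sample(rng: random.Random) -> TrialConfig:
    # Draw one validated parameter set from the space.
    optimizer = rng.choice(["sgd", "adam"])
    lr = 10 ** rng.uniform(-6, 0)  # log-uniform learning rate
    momentum = rng.uniform(0.0, 0.99) if optimizer == "sgd" else None
    return TrialConfig(optimizer, lr, momentum)

cfg = sample(random.Random(0))
print(cfg.optimizer, cfg.learning_rate)
```

Each sampled configuration is checked on construction, so an invalid combination (e.g. momentum with Adam) fails immediately rather than partway through training.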

Available on PyPI.

Use this if you are an ML engineer or researcher who regularly tunes hyperparameters, performs neural architecture search, or needs to manage intricate, reproducible ML experiment configurations.

Not ideal if you only work with simple, fixed model configurations that don't require extensive searching or conditional parameter logic.

machine-learning hyperparameter-tuning ml-experimentation model-optimization deep-learning-configuration
Maintenance 6 / 25
Adoption 5 / 25
Maturity 22 / 25
Community 0 / 25


Stars: 9
Forks:
Language: Python
License: MIT
Last pushed: Oct 29, 2025
Commits (30d): 0
Dependencies: 1

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/keyhankamyar/SpaX"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
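The same request can be made from Python with the standard library. The endpoint and path segments below are taken from the curl example above; the shape of the JSON response is not documented here, so the sketch just prints whatever comes back, and the network call is left commented out.

```python
import json
import urllib.request

# Same endpoint as the curl example: /quality/<category>/<owner>/<repo>
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "keyhankamyar", "SpaX")
print(url)

# Uncomment to actually fetch (no API key needed up to 100 requests/day):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     print(json.dumps(data, indent=2))
```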