young-geng/mlxu
Machine Learning eXperiment Utilities
mlxu is a set of utilities that helps machine learning engineers streamline their experimentation workflow. Given a Python training script and its configuration settings (learning rates, network architectures, and so on), it provides organized experiment logging, auto-generated command-line interfaces for configuration, and simplified random number handling for JAX models, making complex model development more manageable.
No commits in the last 6 months. Available on PyPI.
Use this if you are a machine learning engineer working with JAX who needs a more efficient way to manage experiment configurations and command-line arguments, with built-in Weights & Biases logging.
Not ideal if you are not working in Python, or if your project falls outside machine learning experimentation, especially one that does not involve JAX.
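A minimal sketch of the workflow described above, under the assumption that mlxu exposes the entry points seen in its README and in projects built on it (define_flags_with_default, run, WandBLogger, get_user_flags, and the jax_utils RNG helpers); exact names and signatures may differ between versions, so treat this as an illustration rather than verified usage.

# Sketch only: the API names below (define_flags_with_default, run, WandBLogger,
# get_user_flags, jax_utils.set_random_seed / next_rng) are assumptions based on
# how mlxu-style training scripts are typically written; check the README.
import jax
import jax.numpy as jnp
import mlxu
import mlxu.jax_utils as jax_utils

# Command-line interface: each keyword becomes an overridable absl flag,
# e.g. `python train.py --learning_rate=1e-4 --total_steps=500`.
FLAGS, FLAGS_DEF = mlxu.define_flags_with_default(
    seed=42,
    learning_rate=3e-4,
    total_steps=1000,
    logger=mlxu.WandBLogger.get_default_config(),  # nested W&B settings
)


def main(argv):
    # Organized experiment logging: the flag values are recorded as the run's
    # configuration and metrics are forwarded to Weights & Biases.
    variant = mlxu.get_user_flags(FLAGS, FLAGS_DEF)
    logger = mlxu.WandBLogger(config=FLAGS.logger, variant=variant)

    # Simplified JAX RNG handling: seed once, draw a fresh key per step.
    jax_utils.set_random_seed(FLAGS.seed)

    params = jnp.zeros(4)
    for step in range(FLAGS.total_steps):
        rng = jax_utils.next_rng()
        noise = jax.random.normal(rng, params.shape)
        loss = jnp.mean((params - noise) ** 2)  # placeholder loss
        logger.log({'step': step, 'loss': float(loss)})


if __name__ == '__main__':
    mlxu.run(main)  # wraps absl.app.run: parses flags, then calls main

The value of this pattern is that a single flag definition drives both the command line and the logged experiment configuration, so every run is reproducible from its recorded flags.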
Stars: 48
Forks: 9
Language: Python
License: MIT
Category: ML frameworks
Last pushed: Jul 29, 2025
Commits (30d): 0
Dependencies: 6
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/young-geng/mlxu"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
optuna/optuna
A hyperparameter optimization framework
keras-team/keras-tuner
A Hyperparameter Tuning Library for Keras
KernelTuner/kernel_tuner
Kernel Tuner
syne-tune/syne-tune
Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
deephyper/deephyper
DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning