AmanPriyanshu/DP-HyperparamTuning
DP-HyperparamTuning offers an array of tools for fast and easy tuning of the various hyperparameters of the DP-SGD algorithm.
This tool helps machine learning engineers and researchers optimize the performance of differentially private deep learning models. It takes your model, datasets, and a defined search space for hyperparameters as input, then efficiently explores different hyperparameter combinations. The output is a set of optimized hyperparameters that improve model performance while maintaining privacy guarantees.
No commits in the last 6 months.
Use this if you are a machine learning practitioner building deep learning models that require differential privacy and need to efficiently find the best hyperparameters to maximize model accuracy or other metrics.
Not ideal if you are working with models that do not require differential privacy or if you need a hyperparameter optimization solution for traditional (non-private) deep learning.
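To illustrate the workflow described above, here is a minimal, self-contained sketch of a grid search over typical DP-SGD hyperparameters (learning rate, noise multiplier, gradient clipping norm). All names, search-space values, and the scoring function are hypothetical placeholders, not this repository's API; a real run would train a differentially private model for each combination and score it on validation accuracy under the resulting privacy budget.

```python
from itertools import product

# Hypothetical search space for DP-SGD hyperparameters (illustrative values only).
search_space = {
    "learning_rate": [0.01, 0.05, 0.1],
    "noise_multiplier": [0.8, 1.0, 1.2],
    "max_grad_norm": [0.5, 1.0],
}

def evaluate(params):
    """Stand-in for training a DP-SGD model and returning a validation
    metric. A real implementation would train with these settings and
    measure accuracy at the privacy budget they imply."""
    # Toy surrogate score so the sketch runs end to end.
    return (params["learning_rate"]
            - 0.1 * params["noise_multiplier"]
            + 0.05 * params["max_grad_norm"])

def grid_search(space):
    """Exhaustively evaluate every hyperparameter combination and
    return the best-scoring one."""
    keys = list(space)
    best_params, best_score = None, float("-inf")
    for values in product(*(space[k] for k in keys)):
        params = dict(zip(keys, values))
        score = evaluate(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best, score = grid_search(search_space)
print(best)
```

In practice the grid would also be constrained by a target privacy budget (epsilon), since the noise multiplier and number of training steps jointly determine the privacy guarantee.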
Stars: 23
Forks: 5
Language: Jupyter Notebook
License: MIT
Category
Last pushed: Sep 27, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AmanPriyanshu/DP-HyperparamTuning"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
optuna/optuna: A hyperparameter optimization framework
keras-team/keras-tuner: A Hyperparameter Tuning Library for Keras
KernelTuner/kernel_tuner: Kernel Tuner
syne-tune/syne-tune: Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
deephyper/deephyper: DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning