kenziyuliu/DP2
[ICLR 2023] Official JAX/Haiku implementation of the paper "Differentially Private Adaptive Optimization with Delayed Preconditioners"
DP2 helps machine learning researchers and practitioners train models while preserving user data privacy. Given a dataset and model architecture, it applies differentially private adaptive optimization with delayed preconditioners (the DP2 method) to produce a trained model that balances accuracy with formal privacy guarantees. It is aimed primarily at researchers working on privacy-preserving machine learning.
No commits in the last 6 months.
Use this if you need to train machine learning models on sensitive data and want better accuracy under a formal differential privacy guarantee.
Not ideal if your primary goal is model training without privacy constraints, or if you want a plug-and-play solution and do not want to learn privacy-preserving optimization methods.
Stars
16
Forks
—
Language
Python
License
Apache-2.0
Category
ml-frameworks
Last pushed
Dec 14, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/kenziyuliu/DP2"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
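The same endpoint can be queried programmatically. Below is a minimal Python sketch using only the standard library; the URL comes from the curl command above, but the response's JSON schema is not documented here, so the sketch simply fetches and pretty-prints whatever the API returns.

```python
import json
import urllib.request

# Endpoint pieces taken from the curl command above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"
category, owner, repo = "ml-frameworks", "kenziyuliu", "DP2"
url = f"{BASE}/{category}/{owner}/{repo}"
print(url)

try:
    # Anonymous access is limited to 100 requests/day per the note above.
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.load(resp)
    print(json.dumps(data, indent=2))
except OSError as err:
    # Network unavailable, endpoint down, or rate limit exceeded.
    print(f"request failed: {err}")
```

The try/except keeps the script usable offline; how an API key is passed for the 1,000/day tier is not specified here, so it is omitted.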
Higher-rated alternatives
meta-pytorch/opacus
Training PyTorch models with differential privacy
tensorflow/privacy
Library for training machine learning models with privacy for training data
tf-encrypted/tf-encrypted
A Framework for Encrypted Machine Learning in TensorFlow
awslabs/fast-differential-privacy
Fast, memory-efficient, scalable optimization of deep learning with differential privacy
privacytrustlab/ml_privacy_meter
Privacy Meter: An open-source library to audit data privacy in statistical and machine learning...