privacytrustlab/ml_privacy_meter

Privacy Meter: An open-source library to audit data privacy in statistical and machine learning algorithms.

Score: 53 / 100 (Established)

This tool helps data privacy officers and risk managers assess how much private information might be leaking from machine learning models. You provide your trained AI models and the data used to train them, and it generates reports quantifying the privacy risks, such as whether specific individuals' data can be identified. This is for professionals responsible for ensuring AI systems comply with privacy regulations.

703 stars. No commits in the last 6 months.

Use this if you need to quantitatively evaluate and verify the privacy of individuals' data used in your AI systems, especially for compliance with regulations like GDPR.

Not ideal if you are looking for a tool to anonymize your data or to develop privacy-preserving machine learning models directly, as this focuses on auditing existing models.

Tags: data-privacy, compliance-auditing, AI-governance, risk-assessment, machine-learning-security
Flags: Stale (6 months), No Package, No Dependents

Score breakdown:
Maintenance: 2 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 25 / 25


Stars: 703
Forks: 150
Language: Jupyter Notebook
License: MIT
Last pushed: Apr 26, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/privacytrustlab/ml_privacy_meter"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
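The same endpoint can also be queried programmatically. A minimal Python sketch using only the standard library; it assumes the endpoint returns a JSON body (the exact response schema is not documented here):

```python
import json
import urllib.request

# Public quality-score endpoint from the curl example above
# (no API key required on the free 100 requests/day tier).
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/privacytrustlab/ml_privacy_meter")

def fetch_quality_report(url: str = URL) -> dict:
    """Fetch the quality report and decode it as JSON.

    Assumes the API responds with a JSON object; raises on
    HTTP errors or non-JSON bodies.
    """
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)
```

Usage: `report = fetch_quality_report()` then inspect the returned dictionary, e.g. `print(json.dumps(report, indent=2))`. Field names in the response are not specified on this page, so check the decoded object before relying on any particular key.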