Lee-Gihun/FedNTD

(NeurIPS 2022) Official Implementation of "Preservation of the Global Knowledge by Not-True Distillation in Federated Learning"

Score: 41 / 100 (Emerging)

This is a framework for researchers and practitioners working on federated learning. It lets you experiment with federated learning algorithms, including FedNTD, to train machine learning models collaboratively without centralizing data. You supply a dataset such as MNIST or CIFAR-10 plus a federated-learning configuration, and it produces trained models and experiment logs. It is designed for those evaluating or implementing distributed machine learning solutions.

No commits in the last 6 months.

Use this if you are a researcher or engineer in machine learning who needs to compare or develop federated learning algorithms for distributed model training.

Not ideal if you are looking for a pre-packaged, production-ready federated learning solution for immediate deployment without deep algorithm experimentation.

Topics: federated-learning, distributed-machine-learning, privacy-preserving-ai, model-training, machine-learning-research
Status: Stale (6 months), No Package, No Dependents
Score breakdown:
Maintenance: 0 / 25
Adoption: 9 / 25
Maturity: 16 / 25
Community: 16 / 25


Stars: 90
Forks: 14
Language: Python
License: MIT
Last pushed: Feb 24, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Lee-Gihun/FedNTD"

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000 requests/day.
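The same endpoint can be called from Python with the standard library. A minimal sketch, assuming only the URL pattern shown in the curl example above (`/api/v1/quality/<category>/<owner>/<repo>`); the fields of the JSON response are not documented here, so none are assumed:

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """GET the quality record and return the parsed JSON body."""
    url = quality_url(category, owner, repo)
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)
```

For example, `fetch_quality("ml-frameworks", "Lee-Gihun", "FedNTD")` requests the same record as the curl command above.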