siddharthdivi/Unifying-Distillation-with-Personalization-in-Federated-Learning

Repository containing the code for the paper 'Unifying Distillation with Personalization in Federated Learning'.

Score: 35 / 100 (Emerging)

This project is aimed at machine learning researchers and practitioners working with federated learning. It uses distributed image datasets such as CIFAR-10 and MNIST to evaluate several personalized federated learning algorithms, including PersFL-KD, FedAvg, FedPer, pFedMe, and Per-FedAvg. The output consists of experimental results and performance metrics saved as pickle files, enabling comparison of different model configurations.
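
Since the results are persisted as pickle files, a minimal sketch of loading one for analysis might look like the following. The file path and the assumption that the object is a dict of metrics are both hypothetical; actual names and layouts depend on the experiment configuration in the repository.

import pickle

# Hypothetical path: the repo stores run outputs as pickle files, but the
# exact filenames and object layout depend on the experiment configuration.
results_path = "results/persfl_cifar10.pkl"

with open(results_path, "rb") as f:
    results = pickle.load(f)

# Inspect the top-level structure before assuming a schema.
print(type(results))
if isinstance(results, dict):
    for key, value in results.items():
        print(key, "->", type(value).__name__)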

No commits in the last 6 months.

Use this if you are researching or implementing personalized federated learning methods and need to compare their performance on image classification tasks.

Not ideal if you are looking for a ready-to-deploy federated learning solution or a tool for non-research-oriented machine learning applications.

Tags: federated-learning, distributed-machine-learning, image-classification, model-personalization, machine-learning-research
Status: Stale (6m), No Package, No Dependents

Score breakdown:
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 14 / 25

Stars: 13
Forks: 3
Language: Jupyter Notebook
License: MIT
Last pushed: May 31, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/siddharthdivi/Unifying-Distillation-with-Personalization-in-Federated-Learning"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
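
For scripted use, a minimal sketch of calling the same endpoint from Python, assuming the response body is JSON (the exact response schema is not documented here, so the snippet just dumps the payload rather than assuming field names):

import json
import urllib.request

# Endpoint from the curl example above; no API key is required
# for up to 100 requests/day.
URL = ("https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/"
       "siddharthdivi/Unifying-Distillation-with-Personalization-in-Federated-Learning")

with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

# Pretty-print the raw payload to discover the actual schema.
print(json.dumps(data, indent=2))

Using only the standard library keeps the sketch dependency-free; once the schema is confirmed, individual fields can be read from the returned dict.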