EricLoong/feddip
The official code for the ICDM 2023 paper: 'FedDIP: Federated Learning with Extreme Dynamic Pruning and Incremental Regularization'
FedDIP helps machine learning engineers improve the efficiency of federated learning. Given distributed datasets and a model architecture such as AlexNet or ResNet, it applies extreme dynamic pruning and incremental regularization to produce a lighter yet still accurate global model. It targets data scientists and ML researchers working with privacy-sensitive or large-scale distributed data.
No commits in the last 6 months.
Use this if you are developing or experimenting with federated learning systems and need to optimize model size and training performance without sacrificing accuracy, especially for image classification tasks.
Not ideal if you are looking for a general-purpose federated learning framework that does not prioritize model compression or pruning.
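To make the compression idea concrete, here is a minimal sketch of magnitude-based pruning, the general technique underlying dynamic pruning schemes. This is an illustration only, not the paper's implementation; the function name and the NumPy formulation are assumptions for the example.

```python
import numpy as np

def prune_by_magnitude(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a boolean mask that zeroes out the smallest-magnitude weights.

    `sparsity` is the fraction of weights to remove (0.0 keeps all,
    0.9 keeps only the largest 10% by absolute value). Illustrative only,
    not the FedDIP algorithm itself.
    """
    k = int(weights.size * sparsity)
    if k == 0:
        return np.ones_like(weights, dtype=bool)
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return np.abs(weights) > threshold

w = np.array([0.5, -0.01, 0.3, 0.02, -0.8, 0.001])
mask = prune_by_magnitude(w, 0.5)
pruned = w * mask  # half of the weights are zeroed: [0.5, 0, 0.3, 0, -0.8, 0]
```

In a dynamic-pruning scheme the mask is recomputed as training progresses, so connections pruned early can regrow if their weights become large again; the regularization term steers training toward weights that survive pruning.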
Stars
14
Forks
—
Language
Python
License
MIT
Category
ml-frameworks
Last pushed
Aug 16, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/EricLoong/feddip"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
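The same endpoint can be queried from Python with the standard library. The response schema is not documented here, so treating it as opaque JSON is the safe assumption; the helper names below are illustrative.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def quality_url(owner: str, repo: str) -> str:
    # Path segments mirror the curl example above: /{owner}/{repo}
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Response fields are an assumption; inspect the raw JSON first.
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

# data = fetch_quality("EricLoong", "feddip")
```

With a free API key (1,000 requests/day), the key would typically be passed as a header or query parameter; check the API's documentation for the exact mechanism.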
Higher-rated alternatives
flwrlabs/flower
Flower: A Friendly Federated AI Framework
JonasGeiping/breaching
Breaching privacy in federated learning scenarios for vision and text
anupamkliv/FedERA
FedERA is a modular and fully customizable open-source FL framework, aiming to address these...
zama-ai/concrete-ml
Concrete ML: Privacy Preserving ML framework using Fully Homomorphic Encryption (FHE), built on...
p2pfl/p2pfl
P2PFL is a decentralized federated learning library that enables federated learning on...