ChristianInterno/AutoFLIP
Automated Federated Learning via Informed Pruning (https://arxiv.org/abs/2405.10271)
This project helps machine learning engineers and researchers train deep learning models more efficiently across distributed devices, especially when data is not uniformly distributed. It takes local device datasets and training parameters as input and outputs a compressed, efficient global deep learning model. It suits collaborative AI projects where data privacy and per-device computational resources are concerns.
No commits in the last 6 months.
Use this if you are building federated learning systems and need to significantly reduce the computational load and improve model performance on diverse, non-uniform datasets.
Not ideal if you are working with a single, centralized dataset or if your federated learning environment consists only of uniformly distributed data.
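For intuition, informed pruning in a federated setting can be sketched as: each client scores parameter importance on its local data, the server aggregates those scores into a shared pruning mask, and all clients then train the resulting sparse global model. The sketch below is a toy illustration using magnitude-based scores in NumPy, not AutoFLIP's actual algorithm; `score_params`, `aggregate_scores`, and the 50% sparsity target are illustrative assumptions.

```python
import numpy as np

def score_params(weights: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    # Toy stand-in for a client's local importance scores:
    # parameter magnitude plus a little client-specific noise
    # (real scores would come from local loss sensitivity).
    return np.abs(weights) + 0.01 * rng.standard_normal(weights.shape)

def aggregate_scores(client_scores: list[np.ndarray]) -> np.ndarray:
    # Server averages importance scores across all clients.
    return np.mean(client_scores, axis=0)

def build_mask(scores: np.ndarray, sparsity: float) -> np.ndarray:
    # Keep the top (1 - sparsity) fraction of parameters.
    threshold = np.quantile(scores, sparsity)
    return (scores > threshold).astype(np.float32)

rng = np.random.default_rng(0)
global_weights = rng.standard_normal(1000)

# Each client scores parameters on its own (non-IID) data.
client_scores = [score_params(global_weights, np.random.default_rng(seed))
                 for seed in range(5)]

mask = build_mask(aggregate_scores(client_scores), sparsity=0.5)
pruned = global_weights * mask  # clients now train this sparse model
print(f"kept {mask.mean():.0%} of parameters")
```

Pruning before training is what reduces the per-device computational load: every client only updates the parameters the shared mask keeps.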
Stars: 9
Forks: —
Language: Python
License: —
Category: —
Last pushed: Oct 20, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ChristianInterno/AutoFLIP"
Open to everyone: 100 requests/day with no key. A free API key raises the limit to 1,000 requests/day.
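The same endpoint can be queried from Python with the standard library. The snippet below only reconstructs the URL from the curl command above and shows where a live fetch would go; the path structure (`/quality/<category>/<owner>/<repo>`) is inferred from that one example.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Mirrors the curl command above: /quality/<category>/<owner>/<repo>
    return f"{API_BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "ChristianInterno", "AutoFLIP")
print(url)

# Uncomment to fetch live data (no API key needed for 100 requests/day):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     print(data)
```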
Higher-rated alternatives
flwrlabs/flower
Flower: A Friendly Federated AI Framework
JonasGeiping/breaching
Breaching privacy in federated learning scenarios for vision and text
zama-ai/concrete-ml
Concrete ML: Privacy Preserving ML framework using Fully Homomorphic Encryption (FHE), built on...
anupamkliv/FedERA
FedERA is a modular and fully customizable open-source FL framework, aiming to address these...
p2pfl/p2pfl
P2PFL is a decentralized federated learning library that enables federated learning on...