ChristianInterno/AutoFLIP

Automated Federated Learning via Informed Pruning (https://arxiv.org/abs/2405.10271)

Score: 13 / 100 (Experimental)

This project helps machine learning engineers and researchers train deep learning models more efficiently across distributed devices, especially when data is not uniformly distributed. It takes local device datasets and training parameters as input and outputs a compressed, efficient global deep learning model. It is aimed at collaborative AI projects where data privacy and per-device computational resources are concerns.

No commits in the last 6 months.

Use this if you are building federated learning systems and need to significantly reduce the computational load and improve model performance on diverse, non-uniform datasets.

Not ideal if you are working with a single, centralized dataset or if your federated learning environment consists only of uniformly distributed data.

Tags: federated-learning, distributed-machine-learning, model-compression, deep-learning-optimization, privacy-preserving-ai
Flags: No License, Stale (6m), No Package, No Dependents
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 8 / 25
Community: 0 / 25


Stars: 9
Forks:
Language: Python
License: none
Last pushed: Oct 20, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ChristianInterno/AutoFLIP"

The API is open to everyone: 100 requests/day with no key, or a free key raises the limit to 1,000 requests/day.
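For scripted access, the curl command above can be reproduced in Python. This is a minimal sketch: the URL pattern and the "ml-frameworks" category are taken directly from the curl example, but the response's JSON field names are not documented here, so the sketch only builds the URL and fetches the raw body.

```python
from urllib.parse import quote
from urllib.request import urlopen

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-report URL for a repo (path segments URL-escaped)."""
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

url = quality_url("ml-frameworks", "ChristianInterno", "AutoFLIP")
# Uncomment to fetch the raw report (counts against the 100/day no-key limit):
# body = urlopen(url).read()
```

The commented-out fetch keeps the sketch runnable offline; swap in any HTTP client you prefer.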