diaoenmao/SemiFL-Semi-Supervised-Federated-Learning-for-Unlabeled-Clients-with-Alternate-Training

[NeurIPS 2022] SemiFL: Semi-Supervised Federated Learning for Unlabeled Clients with Alternate Training

Score: 42 / 100 (Emerging)

This project helps machine learning researchers and data scientists improve model performance when a small, centrally held labeled dataset is complemented by many distributed, unlabeled datasets. Starting from a model trained on the server's labeled data, it alternates training between the server and clients that hold only unlabeled data, producing a more robust and accurate model without sharing raw client data. This makes it well suited to research labs and organizations handling privacy-sensitive data spread across multiple entities.
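The alternate-training idea described above can be sketched in a few lines. This is a minimal illustration, not the repository's actual implementation: it assumes a toy logistic-regression model, synthetic two-blob data, a hypothetical confidence threshold of 0.9 for pseudo-labels, and plain FedAvg aggregation.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_blobs(n):
    """Two well-separated Gaussian blobs with labels 0/1 (toy data)."""
    x0 = rng.normal(-2.0, 1.0, size=(n // 2, 2))
    x1 = rng.normal(+2.0, 1.0, size=(n - n // 2, 2))
    return np.vstack([x0, x1]), np.array([0] * (n // 2) + [1] * (n - n // 2))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(w, X, y, lr=0.1, epochs=20):
    """Plain logistic-regression gradient descent (bias folded into w)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    for _ in range(epochs):
        w = w - lr * Xb.T @ (sigmoid(Xb @ w) - y) / len(y)
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (sigmoid(Xb @ w) >= 0.5).astype(int)

# Server holds a small labeled set; each client holds only unlabeled features.
X_lab, y_lab = make_blobs(40)
clients = [make_blobs(200)[0] for _ in range(5)]

w = np.zeros(3)
for _ in range(5):  # alternate-training rounds
    # (1) Server fine-tunes the global model on its labeled data.
    w = train(w, X_lab, y_lab)
    # (2) Each client pseudo-labels its own data with the global model,
    #     keeping only confident predictions, and trains locally.
    local_ws = []
    for Xc in clients:
        p = sigmoid(np.hstack([Xc, np.ones((len(Xc), 1))]) @ w)
        conf = (p >= 0.9) | (p <= 0.1)  # assumed confidence threshold
        if conf.sum() == 0:
            local_ws.append(w)  # nothing confident yet: keep global weights
            continue
        pseudo = (p[conf] >= 0.5).astype(int)
        local_ws.append(train(w.copy(), Xc[conf], pseudo))
    # (3) Server aggregates client models by simple averaging (FedAvg).
    w = np.mean(local_ws, axis=0)

acc = (predict(w, X_lab) == y_lab).mean()
print(f"labeled-set accuracy after alternate training: {acc:.2f}")
```

Note that raw client data never leaves the clients; only model weights travel to the server, which is the privacy property the project emphasizes.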

No commits in the last 6 months.

Use this if you need to train high-performing machine learning models using both a small labeled dataset and a large, distributed collection of unlabeled data, while respecting data privacy.

Not ideal if all your data is centrally located and labeled, or if you don't have access to distributed, unlabeled datasets.

federated-learning semi-supervised-learning distributed-machine-learning privacy-preserving-ai machine-learning-research
Flags: Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 8 / 25
Maturity: 16 / 25
Community: 18 / 25

How are scores calculated?

Stars: 42
Forks: 13
Language: Python
License: MIT
Last pushed: Jul 19, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/diaoenmao/SemiFL-Semi-Supervised-Federated-Learning-for-Unlabeled-Clients-with-Alternate-Training"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
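The same report can be fetched from a script instead of the shell. A minimal stdlib-only sketch, assuming the endpoint returns JSON as the curl command suggests; the response's field names are not documented here, so the code only pretty-prints whatever comes back rather than assuming a schema:

```python
import json
import urllib.request

# Endpoint taken verbatim from the curl example above.
URL = ("https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/"
       "diaoenmao/SemiFL-Semi-Supervised-Federated-Learning-for-"
       "Unlabeled-Clients-with-Alternate-Training")

def fetch_report(url=URL, timeout=10):
    """Fetch the quality report and decode it as JSON."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (performs a network request):
#   print(json.dumps(fetch_report(), indent=2))
```

No API key is needed at the anonymous 100 requests/day tier, so the request carries no authentication header.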