SemiFL: Communication Efficient Semi-Supervised Federated Learning with Unlabeled Clients

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission · Readers: Everyone
Keywords: Federated Learning, Semi-Supervised Learning, Strong data augmentation, Unlabeled data
Abstract: Federated Learning allows training machine learning models by using the computation and private data resources of many distributed clients such as smartphones and IoT devices. Most existing works on Federated Learning (FL) assume the clients have ground-truth labels. However, in many practical scenarios, clients may be unable to label task-specific data, e.g., due to a lack of expertise. This work considers a server that hosts a labeled dataset and wishes to leverage clients with unlabeled data for supervised learning. We propose a new Federated Learning framework referred to as SemiFL to address Semi-Supervised Federated Learning (SSFL). In SemiFL, clients have completely unlabeled data, while the server has a small amount of labeled data. SemiFL is communication efficient since it separates the training of server-side supervised data and client-side unsupervised data. We demonstrate several strategies of SemiFL that enhance efficiency and prediction and develop intuitions of why they work. In particular, we provide a theoretical understanding of the use of strong data augmentation for Semi-Supervised Learning (SSL), which can be interesting in its own right. Extensive empirical evaluations demonstrate that our communication efficient method can significantly improve the performance of a labeled server with unlabeled clients. Moreover, we demonstrate that SemiFL can outperform many existing FL results trained with fully supervised data, and perform competitively with the state-of-the-art centralized SSL methods. For instance, in standard communication efficient scenarios, our method can perform $93\%$ accuracy on the CIFAR10 dataset with only $4000$ labeled samples at the server. Such accuracy is only $2\%$ away from the result trained from $50000$ fully labeled data, and it improves about $30\%$ upon existing SSFL methods in the communication efficient setting.
One-sentence Summary: We propose SemiFL, a new Federated Learning framework for Semi-Supervised Federated Learning (SSFL), in which clients have completely unlabeled data while the server has a small amount of labeled data.
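
To make the abstract's description concrete, the sketch below illustrates one possible SemiFL-style communication round: clients train locally on unlabeled data using pseudo-labels produced under strong data augmentation, and the server then aggregates the client models and fine-tunes on its small labeled set. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes a FixMatch-style confidence threshold with weak/strong augmentation pairs and plain FedAvg aggregation, and all names (client_update, server_round, TAU) are hypothetical.

```python
# Minimal sketch of a SemiFL-style round (assumptions noted above; not the paper's code).
import copy
import torch
import torch.nn.functional as F

TAU = 0.95  # assumed confidence threshold for accepting pseudo-labels

def client_update(global_model, unlabeled_loader, epochs=1, lr=0.01):
    """Local unsupervised training on a client with completely unlabeled data."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        # The loader is assumed to yield weak/strong augmented views of the same batch.
        for x_weak, x_strong in unlabeled_loader:
            with torch.no_grad():
                probs = F.softmax(model(x_weak), dim=1)
                conf, pseudo = probs.max(dim=1)
                mask = conf >= TAU  # keep only confident pseudo-labels
            if mask.any():
                # Train on the strongly augmented view against the pseudo-labels.
                loss = F.cross_entropy(model(x_strong)[mask], pseudo[mask])
                opt.zero_grad()
                loss.backward()
                opt.step()
    return model.state_dict()

def server_round(global_model, client_loaders, labeled_loader, lr=0.01):
    """One communication round: aggregate client models, then fine-tune on labeled data."""
    states = [client_update(global_model, dl) for dl in client_loaders]
    # Plain FedAvg: average each parameter across clients (equal weights assumed).
    avg = {k: torch.stack([s[k].float() for s in states]).mean(dim=0)
           for k in states[0]}
    global_model.load_state_dict(avg)
    opt = torch.optim.SGD(global_model.parameters(), lr=lr)
    for x, y in labeled_loader:  # server-side supervised fine-tuning
        loss = F.cross_entropy(global_model(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return global_model
```

Because each round exchanges only model parameters and the supervised and unsupervised phases run on disjoint data (server vs. clients), this separation is what the abstract refers to as communication efficient.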