Federated Learning from Only Unlabeled Data with Class-conditional-sharing Clients

29 Sept 2021, 00:32 (modified: 11 May 2022, 17:28) · ICLR 2022 Poster
Keywords: unsupervised federated learning, unlabeled data, class prior shift
Abstract: Supervised federated learning (FL) enables multiple clients to share a trained model without sharing their labeled data. However, potential clients may be reluctant even to label their own data, which could limit the applicability of FL in practice. In this paper, we show that unsupervised FL is possible, in the sense that the trained model is still a classifier for predicting class labels, provided that the class-prior probabilities are shifted while the class-conditional distributions are shared among the unlabeled data owned by the clients. We propose federation of unsupervised learning (FedUL), in which the unlabeled data are transformed into surrogate labeled data for each of the clients, a modified model is trained by supervised FL, and the wanted model is recovered from the modified model. FedUL is a very general solution to unsupervised FL: it is compatible with many supervised FL methods, and the recovery of the wanted model can be theoretically guaranteed as if the data had been labeled. Experiments on benchmark and real-world datasets demonstrate the effectiveness of FedUL. Code is available at https://github.com/lunanbit/FedUL.
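The abstract's transform-train-recover pipeline can be illustrated numerically. Below is a minimal sketch, not the authors' implementation: all names, the uniform client weighting, and the use of exact posteriors in place of a trained network are illustrative assumptions. It shows why, when clients share class-conditionals but have shifted (known) class priors, a model trained to predict surrogate labels (client indices) determines the wanted class posterior through a linear transition map, which can then be inverted.

```python
import numpy as np

# Hypothetical setup (illustrative, not the paper's code): M clients share
# the class-conditionals p(x|y) but have shifted, known class priors
# Pi[m, k] = p_m(y = k).
M, K = 4, 3
rng = np.random.default_rng(0)
Pi = rng.dirichlet(np.ones(K), size=M)      # each row sums to 1

# Ground-truth class posterior p(y|x) for one test point x (demo only).
p_y = np.array([0.7, 0.2, 0.1])

# Surrogate step: each client's unlabeled data receive the client index as
# a surrogate label, and supervised FL trains a "modified model" to output
# q(m|x). Assuming uniform client weights p(m) = 1/M, Bayes' rule gives
#   q(m|x) = sum_k p(m) * Pi[m, k] / p(k) * p(y = k|x),
# i.e. q = A @ p_y for the transition matrix A below.
p_m = np.full(M, 1.0 / M)
p_k = p_m @ Pi                              # marginal class prior p(k)
A = (p_m[:, None] * Pi) / p_k[None, :]      # M x K transition matrix
q = A @ p_y                                 # the modified model's target

# Recovery step: since M >= K and A generically has full column rank, the
# wanted posterior p(y|x) is recovered by solving the linear system.
p_rec, *_ = np.linalg.lstsq(A, q, rcond=None)
print(p_rec)
```

The recovery is exact here because the surrogate posterior is a noiseless linear image of the class posterior; with a learned modified model, the same inversion applies to its outputs.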
One-sentence Summary: Federated learning: no label no cry