Keywords: federated learning, CLIP, unsupervised learning
Abstract: Federated learning facilitates collaborative model training across multiple distributed clients without requiring data sharing.
However, conventional federated methods struggle with classification in the unsupervised setting due to the absence of category knowledge.
Recently, CLIP, a prominent vision-language model, has demonstrated impressive results, in particular a remarkable zero-shot classification ability that alleviates the dependence on labeled data.
In this paper, we first explore a new, realistic problem, unsupervised federated learning with CLIP, in which clients holding unlabeled, heterogeneous data collaborate to improve global performance.
To address this problem, we propose FedPP, a method that incorporates a cooperative pseudo-label selection strategy and a partial prompt aggregation protocol.
Our selection strategy ensures that all classes are trained in a balanced manner through global pseudo-label allocation.
Concurrently, the aggregation protocol divides parameters into aggregated and retained components to optimize global performance while supporting local personalization.
Extensive experiments across six datasets with various types of heterogeneity demonstrate the effectiveness of FedPP.
Our code is available in the supplementary materials.
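The abstract describes two components without pseudocode, so the following is a minimal sketch of how they might fit together: class-balanced pseudo-label selection from CLIP-style confidence scores, and partial aggregation that averages only a shared portion of each client's prompt parameters while the rest is retained locally. The function names, the `per_class_quota` and `shared_frac` parameters, and the random stand-in logits are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch (not the authors' code) of the two ideas in the abstract:
# (1) balanced pseudo-label allocation, (2) partial prompt aggregation.
import numpy as np

def balanced_pseudo_labels(logits, per_class_quota):
    """Keep at most `per_class_quota` highest-confidence samples per predicted
    class, so every class contributes pseudo-labels (balanced selection)."""
    preds = logits.argmax(axis=1)          # pseudo-labels from CLIP-style logits
    conf = logits.max(axis=1)              # confidence of each prediction
    keep = []
    for c in np.unique(preds):
        idx = np.where(preds == c)[0]
        top = idx[np.argsort(-conf[idx])][:per_class_quota]
        keep.extend(top.tolist())
    return np.array(sorted(keep)), preds

def partial_aggregate(client_prompts, shared_frac=0.5):
    """Average only the first `shared_frac` of each client's prompt vectors
    (the 'aggregated' part); the remainder stays local (the 'retained' part)."""
    prompts = np.stack(client_prompts)                 # (clients, n_tokens, dim)
    n_shared = int(prompts.shape[1] * shared_frac)
    shared_avg = prompts[:, :n_shared].mean(axis=0)    # server-side average
    return [np.concatenate([shared_avg, p[n_shared:]], axis=0)
            for p in client_prompts]

# Toy usage: 2 clients, 3 classes, random stand-in logits and prompt tensors.
rng = np.random.default_rng(0)
kept, labels = balanced_pseudo_labels(rng.random((20, 3)), per_class_quota=4)
new_prompts = partial_aggregate([rng.random((8, 512)) for _ in range(2)])
print(len(kept), new_prompts[0].shape)
```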
Supplementary Material: zip
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7229