ProxyFL: A Proxy-Guided Framework for Federated Semi-Supervised Learning

ICLR 2026 Conference Submission 269 Authors

01 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Federated Semi-Supervised Learning; Federated Learning
Abstract: Federated Semi-Supervised Learning (FSSL) aims to collaboratively train a global model by leveraging abundant unlabeled data and limited labeled data across clients in a privacy-preserving manner. In FSSL, data heterogeneity is a challenging issue that exists both across clients (external heterogeneity) and within clients (internal heterogeneity). Most FSSL methods either design fixed or dynamic weight-aggregation strategies on the server (for the external case) or directly filter out low-confidence unlabeled samples with an empirical threshold to reduce mistakes on local clients (for the internal case). However, the former struggles to precisely fit the true global category distribution under external heterogeneity, and the latter excludes many available samples from training. To address these issues, we propose a proxy-guided framework called ProxyFL that simultaneously mitigates external and internal heterogeneity via a unified proxy. \emph{I.e.}, we treat the learnable classifier weights as a proxy that simulates the category distribution both locally and globally. For external heterogeneity, we explicitly optimize the global proxy to better fit the category distribution across clients; for internal heterogeneity, we bring the discarded samples back into training alongside the other samples via a positive-negative proxy pool, without being compromised by wrong pseudo-labels. Extensive experiments and theoretical analysis show that ProxyFL significantly boosts FSSL performance and convergence.
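The positive-negative proxy pool idea in the abstract can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's actual method: class scores are hypothetically derived from similarity to classifier-weight proxies, confident samples receive a positive pseudo-label, and unconfident samples (which threshold-based methods would discard) contribute only negative supervision against clearly dissimilar classes. The threshold `tau` and the below-uniform rule for picking negatives are illustrative choices.

```python
import numpy as np

def proxy_assign(probs, tau):
    """Assign each unlabeled sample a training role based on proxy scores.

    probs : (n_samples, n_classes) array of class probabilities, assumed to
            come from similarities between features and classifier-weight
            proxies (hypothetical setup).
    tau   : confidence threshold (illustrative, not from the paper).

    Returns a list of (role, payload) pairs:
      ("positive", class_index)  -- confident sample, gets a pseudo-label;
      ("negative", [classes...]) -- unconfident sample, kept in training but
                                    only pushed away from classes scoring
                                    below the uniform probability 1/K.
    """
    assignments = []
    num_classes = probs.shape[1]
    for p in probs:
        if p.max() >= tau:
            # High confidence: use the top class as a positive pseudo-label.
            assignments.append(("positive", int(p.argmax())))
        else:
            # Low confidence: do not discard; supervise only with
            # negative proxies (classes the sample is clearly unlike).
            negatives = [c for c in range(num_classes) if p[c] < 1.0 / num_classes]
            assignments.append(("negative", negatives))
    return assignments

# Toy example with two samples and five classes.
probs = np.array([
    [0.70, 0.10, 0.10, 0.05, 0.05],   # confident -> positive pseudo-label 0
    [0.30, 0.25, 0.20, 0.15, 0.10],   # unconfident -> negatives below 1/5
])
print(proxy_assign(probs, tau=0.5))
```

In a full pipeline, the "positive" samples would feed a standard pseudo-label loss while the "negative" samples would only decrease similarity to their listed classes, so wrong top-1 pseudo-labels on uncertain samples never act as positives.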
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 269