Learning Locally, Revising Globally: Global Reviser for Federated Learning with Noisy Labels

18 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Federated Learning, Learning with Noisy Labels
TL;DR: This study is the first to observe that the global model in FL exhibits slow memorization of noisy labels, and it proposes FedGR to improve the label-noise robustness of FL based on this observation.
Abstract: In pursuit of data privacy, federated learning (FL) collaboratively trains a global model by aggregating local models learned from decentralized data. However, FL heavily depends on high-quality labels, which are often impractical to obtain in the real world, leading to the federated label-noise (F-LN) problem. Unlike traditional label noise, the F-LN problem is exacerbated by the inherent heterogeneity of FL, where clients experience varying levels and types of label errors. In this study, we observe that the global model of FL exhibits slow memorization of noisy labels, suggesting that it maintains reliable predictions and robust representations in FL. Based on this insight, we propose a novel method termed Global Reviser for Federated Learning with Noisy Labels (FedGR) to improve the robustness of FL against the F-LN problem. Specifically, FedGR first leverages the label-noise-robust characteristics of the global model to filter and refine the noisy labels on each client using a sieving-and-refining module. It then regularizes local model training with the assistance of the global model through two further modules: a globally revised exponential moving average (EMA) distillation module and a global representation regularization module. Extensive experiments on three widely used F-LN benchmarks demonstrate the superior performance of FedGR, which outperforms seven state-of-the-art baselines even under complicated label noise and data heterogeneity. The code will be released upon acceptance.
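The sieving-and-refining idea from the abstract — using the global model's slow memorization of noisy labels to vet each client's labels — could be sketched as follows. This is a minimal illustration under assumed semantics, not the authors' implementation: the function name, the agreement/confidence rules, and the threshold `tau` are all assumptions, since the paper's actual criteria are not given in the abstract.

```python
# Hypothetical sketch of a sieving-and-refining step (not the authors' code):
# a client keeps a sample's label when the global model agrees with it,
# refines the label to the global model's prediction when the global model
# disagrees confidently, and otherwise keeps the label but flags it as noisy.

def sieve_and_refine(labels, global_probs, tau=0.8):
    """Return (revised_labels, clean_mask) using global-model predictions.

    labels       : list[int], possibly-noisy labels on one client
    global_probs : list[list[float]], per-sample class probabilities from
                   the global model (assumed robust to label noise)
    tau          : assumed confidence threshold for refining a label
    """
    revised, clean = [], []
    for y, probs in zip(labels, global_probs):
        pred = max(range(len(probs)), key=lambda c: probs[c])
        if pred == y:                 # global model agrees: sieve as clean
            revised.append(y)
            clean.append(True)
        elif probs[pred] >= tau:      # confident disagreement: refine label
            revised.append(pred)
            clean.append(False)
        else:                         # uncertain: keep label, flag as noisy
            revised.append(y)
            clean.append(False)
    return revised, clean
```

In such a scheme, the flagged subset could then be down-weighted or supervised only by the globally revised EMA teacher during local training, while the clean subset trains on its (kept or refined) labels.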
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 10711