FedDiv: Collaborative Noise Filtering for Federated Learning with Noisy Labels

Published: 31 Dec 2023 · Last Modified: 05 Mar 2025 · The Thirty-Eighth AAAI Conference on Artificial Intelligence (AAAI-24) · CC BY 4.0
Abstract: Federated Learning with Noisy Labels (F-LNL) aims to learn an optimal server model via collaborative distributed learning by aggregating multiple client models trained on local noisy or clean samples. Building on a federated learning framework, recent advances primarily adopt label noise filtering to separate clean samples from noisy ones on each client, thereby mitigating the negative impact of label noise. However, these prior methods do not learn noise filters by exploiting knowledge across all clients, leading to sub-optimal noise filtering performance and thus damaging training stability. In this paper, we present FedDiv to tackle the challenges of F-LNL. Specifically, we propose a global noise filter, called Federated Noise Filter, for effectively identifying samples with noisy labels on every client, thereby improving stability during local training sessions. Without sacrificing data privacy, this is achieved by modeling the global distribution of label noise across all clients. Then, to help the global model achieve higher performance, we introduce a Predictive Consistency based Sampler to identify more credible local data for local model training, thus preventing noise memorization and further boosting training stability. Extensive experiments on CIFAR-10, CIFAR-100, and Clothing1M demonstrate that FedDiv achieves superior performance over state-of-the-art F-LNL methods under various label noise settings for both IID and non-IID data partitions.