FedGC: Federated Learning on Non-IID Data via Learning from Good Clients

Published: 01 Jan 2024, Last Modified: 01 Jul 2025 · PRCV (1) 2024 · CC BY-SA 4.0
Abstract: Federated learning (FL) is a privacy-preserving solution for deep learning with decentralized data owners. An important issue that may degrade the performance of FL is statistical heterogeneity among the data distributions of the data owners (clients). That is, the data of different clients are non-independently and identically distributed (non-IID), so the clients' local objective functions are inconsistent. To cope with this issue, we reveal that an unbiased client selection strategy is not optimal for FL on non-IID data. Motivated by this observation, we propose a new method named FedGC to address the data heterogeneity problem, which tends to select clients with better-performing models. With the proposed FedGC, the negative impact of inconsistent local updates on the performance of the global model is alleviated by learning the optimization directions of the selected clients. On the other hand, all clients may learn from the selected clients in the local training phase to reduce inconsistency among client local updates and increase consistency between the local models and the global one. Experimental results on several benchmarks under various non-IIDness settings show that our proposed FedGC scheme generally outperforms state-of-the-art methods and can serve as a useful plugin for enhancing the performance of FL methods.
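The abstract describes biasing client selection toward clients with better-performing local models, in contrast to the uniform sampling of standard federated averaging. A minimal sketch of one such biased selection rule is below; the function name, the softmax weighting over validation accuracies, and the temperature parameter are illustrative assumptions, not the paper's actual FedGC algorithm.

```python
# Hypothetical sketch of performance-biased client selection.
# The softmax-over-accuracy weighting and all names here are
# assumptions for illustration, not the FedGC method itself.
import math
import random


def select_good_clients(client_accuracies, num_selected, temperature=0.1, seed=None):
    """Sample distinct client ids, favoring higher last-round accuracy.

    client_accuracies: dict mapping client id -> validation accuracy in [0, 1].
    temperature: lower values concentrate selection on the best clients.
    """
    rng = random.Random(seed)
    ids = list(client_accuracies)
    # Softmax over accuracies: better-performing clients get larger weights.
    exps = [math.exp(client_accuracies[c] / temperature) for c in ids]
    total = sum(exps)
    pool = [(c, e / total) for c, e in zip(ids, exps)]

    # Weighted sampling without replacement.
    selected = []
    for _ in range(min(num_selected, len(pool))):
        r = rng.random() * sum(w for _, w in pool)
        cumulative = 0.0
        for i, (cid, w) in enumerate(pool):
            cumulative += w
            if r <= cumulative:
                selected.append(cid)
                pool.pop(i)
                break
    return selected
```

With a small temperature, selection becomes nearly greedy on accuracy; a larger temperature recovers something closer to uniform sampling, which is one simple way to trade off exploiting "good" clients against still visiting the rest.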