FedGosp: A Novel Framework of Gossip Federated Learning for Data Heterogeneity

Published: 01 Jan 2022, Last Modified: 12 Nov 2023. Venue: SMC 2022.
Abstract: Federated learning (FL) offers a way to train models without exposing participants' raw data, but it suffers considerably from data heterogeneity across participants. Several promising FL algorithms improve learning effectiveness under non-independent-and-identically-distributed (non-IID) data settings; however, they require a large number of communication rounds between the server and the clients to reach acceptable accuracy. Inspired by the training paradigm of gossip learning, this paper proposes a new FL framework named FedGosp. It first classifies clients into different categories based on the model weights trained on their locally stored data. FedGosp then uses communication not only between the clients and the server, but also between clients of different classes. This training process instills knowledge about diverse data distributions into the exchanged models. We evaluate FedGosp in multiple non-IID settings on the CIFAR-10 and MNIST datasets and compare it with popular recent algorithms such as SCAFFOLD, FedAvg, and FedProx. The experimental results show that, compared with the strongest baseline, FedGosp improves model accuracy by up to 6.53% and reduces communication costs by up to 5.6x.
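The paper itself provides no code; the following is a minimal NumPy sketch of one plausible FedGosp-style training round under stated assumptions. The grouping criterion (k-means over weight vectors), the gossip pairing rule (random cross-class partners with 50/50 averaging), and the local objective (a placeholder quadratic standing in for real local training) are all illustrative assumptions, not the paper's actual method; all names and hyperparameters are hypothetical.

# Minimal sketch of a FedGosp-style round (NumPy only).
# Assumed details: k-means clustering of weight vectors, random
# cross-class gossip pairing, placeholder quadratic "local training".
import numpy as np

rng = np.random.default_rng(0)
NUM_CLIENTS, DIM, K = 8, 10, 2   # K = number of client categories (assumed)

# Each client holds a local model, represented here as a flat weight vector.
client_weights = [rng.normal(size=DIM) for _ in range(NUM_CLIENTS)]

def local_train(w, steps=5, lr=0.1):
    """Placeholder local update: gradient steps on a synthetic quadratic."""
    target = rng.normal(size=w.shape)       # stands in for the client's private data
    for _ in range(steps):
        w = w - lr * (w - target)           # gradient of 0.5 * ||w - target||^2
    return w

def cluster_clients(weights, k, iters=10):
    """Naive k-means over weight vectors (assumed grouping criterion)."""
    centers = np.stack(weights[:k]).copy()
    labels = np.zeros(len(weights), dtype=int)
    for _ in range(iters):
        labels = np.array([np.argmin([np.linalg.norm(w - c) for c in centers])
                           for w in weights])
        for j in range(k):
            members = [w for w, l in zip(weights, labels) if l == j]
            if members:
                centers[j] = np.mean(members, axis=0)
    return labels

for rnd in range(3):
    # 1) Each client trains on its own (non-IID) data.
    client_weights = [local_train(w) for w in client_weights]

    # 2) Classify clients by their locally trained weights.
    labels = cluster_clients(client_weights, K)

    # 3) Gossip step: each client averages with a random partner from a
    #    *different* class, mixing models trained on dissimilar distributions.
    for i in range(NUM_CLIENTS):
        partners = [j for j in range(NUM_CLIENTS) if labels[j] != labels[i]]
        if partners:
            j = rng.choice(partners)
            client_weights[i] = 0.5 * (client_weights[i] + client_weights[j])

    # 4) Server aggregation (FedAvg-style mean), broadcast back to all clients.
    global_w = np.mean(client_weights, axis=0)
    client_weights = [global_w.copy() for _ in range(NUM_CLIENTS)]
    print(f"round {rnd}: global weight norm = {np.linalg.norm(global_w):.3f}")

The cross-class gossip in step 3 is the mechanism the abstract credits for the savings: because models already carry knowledge of other classes' data distributions before they reach the server, fewer server rounds are needed than in purely server-mediated schemes such as FedAvg.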