Clustered Federated Learning for Heterogeneous Feature Spaces using Siamese Graph Convolutional Neural Network Distance Prediction

Published: 16 May 2023, Last Modified: 02 Jul 2023, FLSys 2023
Keywords: Federated Learning, Graph Neural Network, Clustered Federated Learning
TL;DR: Clustered federated learning that uses graph neural networks to predict the similarity between clients' neural networks, maximizing positive knowledge transfer even under feature space heterogeneity
Abstract: Federated learning (FL) has been proposed to enhance the performance of local machine learning models across multiple devices while maintaining data privacy. One of the main challenges in FL is data heterogeneity, which limits the benefits of knowledge exchange among federated models. Data heterogeneity is a two-fold problem: non-IID (not independent and identically distributed) data and heterogeneous feature spaces. While personalized federated learning (PFL) solutions address the challenges of non-IID data, most of them require a shared feature space. The few PFL solutions applicable to feature heterogeneity map all local domains into a new common feature space, which may degrade performance through negative transfer from unrelated local models built on very different feature spaces. To address this limitation, we propose a novel clustered federated learning method based on a Siamese graph convolutional neural network (FedSGCNN). We predict positive transfer between clients with the SGCNN and use these predictions to build a distance matrix for clustering. This network-based prediction is more accurate than other distance measures, which fail to capture the structure of the models. Under feature space heterogeneity, FedSGCNN outperforms the latest work by 1.2% on the Boston Housing dataset and 2.8% on the Obesity Level dataset.
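
To make the described pipeline concrete, below is a minimal sketch (PyTorch / PyTorch Geometric, scikit-learn >= 1.2) of the general idea: encode each client's model as a graph, score pairs of models with a shared-weight (Siamese) GCN, assemble a pairwise distance matrix, and cluster clients on it. The graph encoding, the SGCNN layer sizes, the distance function, and the agglomerative clustering step are illustrative assumptions, not the paper's exact implementation, and the SGCNN here is untrained, whereas the paper trains it to predict positive transfer.

# Illustrative sketch only: graph construction, SGCNN architecture, and the
# clustering step are assumptions for exposition, not the authors' exact design.
# Requires: torch, torch_geometric, scikit-learn (>= 1.2 for metric="precomputed").
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv, global_mean_pool
from torch_geometric.data import Data
from sklearn.cluster import AgglomerativeClustering


class SiameseGCN(nn.Module):
    """Shared-weight GCN encoder; the distance between two model-graphs is the
    L2 distance between their pooled embeddings (one plausible choice)."""

    def __init__(self, in_dim: int, hidden_dim: int = 32, emb_dim: int = 16):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, emb_dim)

    def embed(self, data: Data) -> torch.Tensor:
        x = torch.relu(self.conv1(data.x, data.edge_index))
        x = self.conv2(x, data.edge_index)
        batch = torch.zeros(x.size(0), dtype=torch.long)  # single graph per pass
        return global_mean_pool(x, batch)

    def forward(self, g_a: Data, g_b: Data) -> torch.Tensor:
        # Siamese: both graphs go through the same encoder, then compare embeddings.
        return torch.norm(self.embed(g_a) - self.embed(g_b), dim=-1)


def model_to_graph(model: nn.Module) -> Data:
    """Hypothetical graph encoding of a client model: one node per linear layer,
    node features are simple weight statistics, edges follow layer order."""
    layers = [m for m in model.modules() if isinstance(m, nn.Linear)]
    feats = []
    for layer in layers:
        w = layer.weight.detach()
        feats.append([w.mean().item(), w.std().item(),
                      float(w.shape[0]), float(w.shape[1])])
    x = torch.tensor(feats, dtype=torch.float)
    src = torch.arange(len(layers) - 1)
    edge_index = torch.stack([src, src + 1])  # chain: layer i -> layer i+1
    return Data(x=x, edge_index=edge_index)


def cluster_clients(client_models, n_clusters: int = 2):
    """Build a pairwise distance matrix with the (here untrained) SGCNN and
    cluster clients on it; FL aggregation would then run within each cluster."""
    sgcnn = SiameseGCN(in_dim=4)
    graphs = [model_to_graph(m) for m in client_models]
    n = len(graphs)
    dist = torch.zeros(n, n)
    with torch.no_grad():
        for i in range(n):
            for j in range(i + 1, n):
                d = sgcnn(graphs[i], graphs[j])
                dist[i, j] = dist[j, i] = d
    clustering = AgglomerativeClustering(
        n_clusters=n_clusters, metric="precomputed", linkage="average"
    )
    return clustering.fit_predict(dist.numpy())


if __name__ == "__main__":
    # Toy clients with heterogeneous feature spaces (different input dimensions).
    clients = [nn.Sequential(nn.Linear(d, 8), nn.Linear(8, 1)) for d in (5, 5, 13, 13)]
    print(cluster_clients(clients, n_clusters=2))

In this sketch the cluster assignments come from an untrained encoder, so they only demonstrate the data flow; in the paper's setting the SGCNN would be trained so that predicted distances track expected positive transfer between clients.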