Abstract: Clustered federated learning (CFL) addresses data heterogeneity in federated learning by grouping clients with similar data distributions into independent clusters, overcoming the limitation that a single global model cannot meet the personalized needs of all devices. However, existing clustering methods face a challenge: the number of clusters is unknown in advance, so the resulting partition may not faithfully reflect the underlying data distribution. Moreover, most clustering methods focus on collaboration among clients within the same cluster and do not consider cooperation between similar clusters, which limits the scope of knowledge dissemination. To address these issues, we propose a fine-grained clustering framework (FGCFL) that introduces a benchmark distribution and refines the cluster partition by aggregating clients that lie between clusters with similar distributions. Based on the refined clusters, self-attention is employed to facilitate collaboration among them, promoting inductive knowledge transfer between similar clusters. Experimental results on several datasets show that the proposed method considerably improves model performance in data-heterogeneous scenarios.
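To make the cross-cluster collaboration step concrete, the sketch below shows one plausible way self-attention could be applied over cluster models: each cluster's parameter vector serves as its own query and key, and scaled dot-product attention yields similarity weights used to mix the cluster models, so similar clusters exchange more knowledge. This is a minimal illustration under stated assumptions, not the paper's exact formulation; the function name, the use of flattened parameter vectors as queries/keys, and the `temperature` parameter are all hypothetical.

```python
import torch
import torch.nn.functional as F

def attention_weighted_cluster_aggregation(cluster_params: torch.Tensor,
                                           temperature: float = 1.0) -> torch.Tensor:
    """Hypothetical cross-cluster aggregation via self-attention.

    cluster_params: (K, D) tensor, one flattened parameter vector per cluster model.
    Returns a (K, D) tensor where each cluster's model is an attention-weighted
    mixture of all K cluster models.
    """
    # Scaled dot-product attention over clusters: the parameter vectors act as
    # both queries and keys, so the scores measure pairwise cluster similarity.
    d = cluster_params.shape[1]
    scores = cluster_params @ cluster_params.T / (temperature * d ** 0.5)  # (K, K)
    weights = F.softmax(scores, dim=1)  # each row sums to 1
    # Similarity-weighted combination: similar clusters contribute more to
    # each other's updated model, enabling knowledge transfer between them.
    return weights @ cluster_params

# Example usage with 4 clusters and 10-dimensional (toy) parameter vectors.
if __name__ == "__main__":
    models = torch.randn(4, 10)
    updated = attention_weighted_cluster_aggregation(models)
    print(updated.shape)  # torch.Size([4, 10])
```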