Keywords: Knowledge Distillation, Expert-Guided Fusion, Heterogeneous Federated Learning
Abstract: Heterogeneous Federated Learning enables collaborative training across devices with diverse architectures and non-IID data. However, it often struggles with effective knowledge fusion, leading to the loss of personalized knowledge during aggregation and potentially exacerbating client model divergence when globally guided updates are misaligned with local data or architectures. To tackle these challenges, we propose FedFuse, a novel framework centered on adaptive, personalized knowledge fusion via logits. FedFuse introduces a server-side Expert-guided Fusion mechanism that enables adaptive knowledge fusion by dynamically gating and weighting heterogeneous client knowledge contributions, moving beyond prior static schemes. Complementarily, a carefully designed selective knowledge distillation strategy allows clients to assimilate global knowledge without blind imitation, thereby preserving crucial local model features and mitigating detrimental model divergence. We provide a rigorous convergence analysis for FedFuse under heterogeneity. Extensive experiments with up to 500 clients and diverse heterogeneity settings, together with ablation studies validating the necessity of both core components, demonstrate the superiority of our approach. FedFuse significantly outperforms state-of-the-art methods in test accuracy, particularly under high heterogeneity, while exhibiting competitive communication and computational efficiency.
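The abstract does not spell out the fusion or distillation formulas. As a rough illustration only, the sketch below shows one possible form of server-side gate-weighted logit fusion and client-side selective distillation, assuming per-client logits computed on a shared batch, a softmax gating module (`GatingNet`), a confidence threshold `tau`, and PyTorch; all of these names and details are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch (not FedFuse's actual implementation): gate-weighted
# fusion of heterogeneous client logits on the server, plus a selective
# distillation loss on the client side. GatingNet and tau are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatingNet(nn.Module):
    """Produces per-sample, per-client fusion weights from client logits."""

    def __init__(self, num_classes: int):
        super().__init__()
        self.score = nn.Linear(num_classes, 1)  # scores each client's logits

    def forward(self, client_logits: torch.Tensor) -> torch.Tensor:
        # client_logits: [num_clients, batch, num_classes]
        scores = self.score(client_logits).squeeze(-1)   # [num_clients, batch]
        return F.softmax(scores, dim=0)                  # weights sum to 1 over clients


def fuse_logits(client_logits: torch.Tensor, gate: GatingNet) -> torch.Tensor:
    """Server-side fusion: gate-weighted sum of heterogeneous client logits."""
    weights = gate(client_logits).unsqueeze(-1)          # [num_clients, batch, 1]
    return (weights * client_logits).sum(dim=0)          # [batch, num_classes]


def selective_kd_loss(local_logits, fused_logits, tau: float = 0.8, temp: float = 2.0):
    """Client-side selective distillation: imitate the fused (global) prediction
    only on samples where it is confident, retaining local behavior elsewhere.
    The confidence threshold tau is an illustrative choice."""
    global_probs = F.softmax(fused_logits / temp, dim=-1)
    mask = (global_probs.max(dim=-1).values > tau).float()          # confident samples
    kd = F.kl_div(F.log_softmax(local_logits / temp, dim=-1),
                  global_probs, reduction="none").sum(-1)           # per-sample KL
    return (mask * kd).sum() / mask.sum().clamp(min=1.0) * temp ** 2


if __name__ == "__main__":
    num_clients, batch, num_classes = 4, 8, 10
    logits = torch.randn(num_clients, batch, num_classes)
    gate = GatingNet(num_classes)
    fused = fuse_logits(logits, gate)
    print(selective_kd_loss(logits[0], fused))
```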
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 16952