Keywords: Knowledge Distillation, Federated Learning, Data Heterogeneity, Model Heterogeneity
TL;DR: A bidirectional knowledge exchange technique for heterogeneous federated learning.
Abstract: Heterogeneous Federated Learning (HFL) is a decentralized machine learning paradigm that enables participants to leverage distributed knowledge from diversified environments while safeguarding individual privacy. Recent works that address both data and model heterogeneity still require aggregating model parameters, which restricts architectural flexibility. Knowledge Distillation (KD) has been adopted in HFL to circumvent direct model aggregation by aggregating knowledge, but it depends on a public dataset and may incur information loss when redistributing knowledge from the global model. We propose Federated Knowledge Exchange (FKE), an aggregation-free FL paradigm in which each client acts as both teacher and student, exchanging knowledge directly with peers and removing the need for a global model. To remove reliance on public data, we attach a lightweight embedding decoder that produces transfer data, forming the Data-Free Federated Knowledge Exchange (DFFKE) framework. Extensive experiments show that DFFKE surpasses nine state-of-the-art HFL baselines by up to 18.14%. Code is available in the supplementary material. Anonymous Repo: https://anonymous.4open.science/r/DFFKE-0E0B.
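To make the peer-to-peer exchange idea concrete, below is a minimal sketch of bidirectional distillation among heterogeneous clients, in the spirit of FKE as described in the abstract. The class and function names (Client, exchange_round), the averaging of peer logits, and the use of random tensors as a stand-in for decoder-generated transfer data are illustrative assumptions, not the authors' implementation.

```python
# Sketch: each client acts as both teacher and student; knowledge is exchanged
# directly between peers, with no parameter aggregation and no global model.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Client:
    """Each client owns its own (possibly unique) architecture."""

    def __init__(self, model: nn.Module, lr: float = 1e-3):
        self.model = model
        self.opt = torch.optim.SGD(self.model.parameters(), lr=lr)

    @torch.no_grad()
    def teach(self, x: torch.Tensor) -> torch.Tensor:
        """Act as teacher: produce soft logits on the transfer data."""
        self.model.eval()
        return self.model(x)

    def learn(self, x: torch.Tensor, peer_logits: list, T: float = 2.0) -> float:
        """Act as student: distill from the averaged peer predictions."""
        self.model.train()
        target = torch.stack(peer_logits).mean(dim=0)  # aggregation choice is an assumption
        loss = F.kl_div(
            F.log_softmax(self.model(x) / T, dim=-1),
            F.softmax(target / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        return loss.item()


def exchange_round(clients: list, transfer_data: torch.Tensor) -> None:
    """One round of direct knowledge exchange among all peers."""
    logits = [c.teach(transfer_data) for c in clients]
    for i, c in enumerate(clients):
        peers = [z for j, z in enumerate(logits) if j != i]
        c.learn(transfer_data, peers)


if __name__ == "__main__":
    # Heterogeneous architectures; random tensors stand in for the transfer data
    # that DFFKE would instead generate with its lightweight embedding decoder.
    clients = [
        Client(nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))),
        Client(nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))),
        Client(nn.Linear(32, 10)),
    ]
    exchange_round(clients, torch.randn(8, 32))
```

The sketch only illustrates the bidirectional teacher/student roles; how DFFKE generates transfer data and weights peer knowledge is specified in the paper and repository, not here.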
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 1144