A Heterogeneous Federated Learning Method Based on Dual Teachers Knowledge Distillation

Published: 01 Jan 2024 · Last Modified: 16 May 2025 · ADMA (2) 2024 · CC BY-SA 4.0
Abstract: Recently, Heterogeneous Federated Learning (HtFL) has received increasing attention due to its ability to conduct collaborative learning across clients with heterogeneous data and models. However, existing HtFL methods typically require proxy datasets, auxiliary models, or the exposure of partial local data, which not only increases communication overhead but also greatly limits their generality. To address this challenge, we propose a novel Heterogeneous Federated learning method based on Dual teachers Knowledge distillation, named FedDK, which handles both data heterogeneity and model heterogeneity without introducing additional proxy data or auxiliary models. Concretely, FedDK fuses knowledge distilled from a global teacher and a local teacher to guide local model training, transferring global and local knowledge in a model-agnostic manner. In addition, a dual-teacher confidence assessment mechanism is designed to mitigate the client drift caused by data heterogeneity. Finally, extensive experiments across various heterogeneous data and model settings demonstrate that our method outperforms state-of-the-art baselines in both model accuracy and communication overhead.
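The dual-teacher fusion the abstract describes can be pictured with a minimal PyTorch sketch. The function name, the confidence definition (each teacher's softmax probability on the ground-truth class), and the loss weighting below are illustrative assumptions, not the paper's exact formulation:

```python
import torch
import torch.nn.functional as F

def dual_teacher_kd_loss(student_logits, global_logits, local_logits,
                         labels, temperature=2.0):
    """Hypothetical sketch of a dual-teacher distillation objective:
    soft targets from a global and a local teacher are fused per sample,
    weighted by each teacher's confidence on the true class (one plausible
    reading of the paper's confidence assessment mechanism)."""
    # Per-sample confidence: each teacher's probability on the true label.
    g_conf = F.softmax(global_logits, dim=1).gather(1, labels.unsqueeze(1))
    l_conf = F.softmax(local_logits, dim=1).gather(1, labels.unsqueeze(1))
    weights = torch.cat([g_conf, l_conf], dim=1)
    weights = weights / weights.sum(dim=1, keepdim=True)  # normalize per sample

    # Temperature-softened distributions for knowledge distillation.
    t = temperature
    student_log_p = F.log_softmax(student_logits / t, dim=1)
    kd_global = F.kl_div(student_log_p, F.softmax(global_logits / t, dim=1),
                         reduction="none").sum(dim=1)
    kd_local = F.kl_div(student_log_p, F.softmax(local_logits / t, dim=1),
                        reduction="none").sum(dim=1)

    # Confidence-weighted fusion of the two teachers, plus supervised CE.
    kd = (weights[:, 0] * kd_global + weights[:, 1] * kd_local).mean() * t * t
    ce = F.cross_entropy(student_logits, labels)
    return ce + kd
```

Because the loss consumes only logits, any pair of teacher architectures can guide any student, which is consistent with the model-agnostic transfer claimed in the abstract.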