Bidirectional domain transfer knowledge distillation for catastrophic forgetting in federated learning with heterogeneous data

Published: 2025 · Last Modified: 16 Oct 2025 · Knowledge-Based Systems, 2025 · License: CC BY-SA 4.0
Abstract: Federated learning (FL) is a distributed machine learning approach that has gained significant attention owing to its advantages in privacy protection and data security. However, heterogeneous data distributions across clients make FL models susceptible to catastrophic forgetting during continual learning, causing knowledge from previous tasks to be lost. To address this issue, this paper presents a bidirectional domain transfer knowledge distillation framework comprising a knowledge transfer module and a knowledge retrospection module, which together enhance knowledge preservation and sharing in FL scenarios. The knowledge transfer module mitigates forgetting caused by data heterogeneity by distilling key knowledge between the global and local models, while the knowledge retrospection module lets the model revisit historical information when updating on the current task, improving stability and adaptability. Experimental results demonstrate that the proposed framework significantly reduces catastrophic forgetting in non-independent and identically distributed (non-IID) data environments and improves the model's generalization performance. Compared with traditional FL methods, the proposed method achieves better knowledge preservation across a range of datasets.
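
The abstract does not give the concrete loss formulation, but the two modules suggest a local client objective combining a task loss with two temperature-scaled distillation terms: one against the global model (knowledge transfer) and one against a frozen snapshot of an earlier local model (knowledge retrospection). Below is a minimal PyTorch sketch under that assumption; the weights `alpha` and `beta`, the temperature `T`, the choice of the previous-round local snapshot as the retrospection teacher, and all function names are illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Temperature-scaled KL-divergence distillation loss (standard Hinton-style KD)."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

def local_update_loss(local_model, global_model, past_model, x, y,
                      alpha=0.5, beta=0.5, T=2.0):
    """Hypothetical local objective: task loss
    + transfer term (global model as teacher)
    + retrospection term (frozen earlier local snapshot as teacher)."""
    logits = local_model(x)
    ce = F.cross_entropy(logits, y)                    # current-task supervision
    with torch.no_grad():
        g_logits = global_model(x)                     # teacher for knowledge transfer
        p_logits = past_model(x)                       # teacher for knowledge retrospection
    transfer = distillation_loss(logits, g_logits, T)  # curbs drift away from global knowledge
    retro = distillation_loss(logits, p_logits, T)     # preserves knowledge from earlier tasks
    return ce + alpha * transfer + beta * retro
```

A symmetric distillation step at the server, transferring knowledge from the client models back into the aggregated global model, would complete the "bidirectional" transfer the title refers to, but its exact form is not specified in the abstract.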