SeqFedEDT: Accelerating Sequential Federated Learning on non-IID Data via Element-wise Decoupled Training
Abstract: Sequential federated learning (SFL) trains models collaboratively across clients in a sequential, chain-like manner, which offers better communication efficiency than traditional parallel FL with a star topology. However, SFL can fail to train stably when clients exhibit significant statistical heterogeneity in their local data distributions. To address these challenges, we propose a novel element-wise model decoupling framework named SeqFedEDT that accelerates SFL training by separating each client's model parameters into a shared subset for global knowledge collaboration and a personalized subset for mitigating data heterogeneity. We explore three types of parameter contribution scoring metrics, based on gradient, Fisher information, and parameter importance (PI), for personalized parameter selection. In addition, we propose a quantile-based thresholding mechanism to separate the shared and personalized subsets and study the best-performing quantile choice through numerical experiments. Extensive experiments demonstrate that SeqFedEDT outperforms eight state-of-the-art methods across diverse datasets and heterogeneity scenarios. All code and results are available at https://github.com/tian0920/SeqFedEDT.
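To make the decoupling step concrete, below is a minimal PyTorch sketch of element-wise parameter splitting via a per-element contribution score and a quantile threshold. The function names (`decouple_parameters`, `gradient_score`, `fisher_score`), the choice of quantile, and the exact scoring formulas are illustrative assumptions rather than the paper's implementation; the authors' code is in the repository linked above.

```python
import torch

def decouple_parameters(model, score_fn, q=0.8):
    """Split each parameter tensor into shared and personalized element masks.

    score_fn maps a parameter tensor to a per-element contribution score;
    elements whose score exceeds the q-th quantile are treated as
    personalized, the rest as shared.  (q = 0.8 is an arbitrary example.)
    """
    masks = {}
    for name, param in model.named_parameters():
        scores = score_fn(param)                          # per-element contribution score
        threshold = torch.quantile(scores.flatten(), q)   # quantile-based threshold
        personalized = scores > threshold                 # boolean mask of personalized elements
        masks[name] = {"personalized": personalized, "shared": ~personalized}
    return masks

# Two illustrative scoring metrics (hypothetical stand-ins for the paper's
# gradient- and Fisher-information-based scores); both assume backward()
# has already been called so that param.grad is populated.
def gradient_score(param):
    # Gradient-magnitude score.
    return param.grad.abs() if param.grad is not None else torch.zeros_like(param)

def fisher_score(param):
    # Diagonal empirical Fisher approximation: element-wise squared gradients.
    return param.grad.pow(2) if param.grad is not None else torch.zeros_like(param)
```

In a sequential training chain, such masks could determine which parameter elements are forwarded to the next client (the shared subset) and which remain local (the personalized subset); this hand-off logic is only sketched here, not shown.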
External IDs: doi:10.1109/tmc.2025.3635720