SPFL: Sequential updates with Parallel aggregation for Enhanced Federated Learning under Category and Domain Shifts

Published: 18 Sept 2025, Last Modified: 29 Oct 2025 · NeurIPS 2025 poster · CC BY 4.0
Keywords: Federated Learning, Domain Shift, Category Shift
TL;DR: Sequential updates with Parallel aggregation for Enhanced Federated Learning under Category and Domain Shifts
Abstract: Federated learning (FL) has recently emerged as the primary approach to overcoming data silos, enabling collaborative model training without sharing sensitive or proprietary data. Parallel federated learning (PFL) aggregates models trained independently on each client's local data, which can lead to suboptimal convergence due to limited data exposure. In contrast, sequential federated learning (SFL) lets the model traverse client datasets sequentially, improving data utilization. However, SFL's effectiveness is limited in real-world non-IID scenarios characterized by category shift (inconsistent class distributions) and domain shift (distribution discrepancies across clients). These shifts cause two critical issues: update order sensitivity, where model performance varies significantly with the sequence of client updates, and catastrophic forgetting, where the model loses previously learned features when trained on new client data. We propose SPFL, a novel updating method that can be integrated into existing FL methods; it combines sequential updates with parallel aggregation to enhance data utilization and mitigate update order sensitivity. We also provide a convergence analysis of SPFL under strongly convex, general convex, and non-convex conditions, proving that this update scheme is significantly better than both PFL and SFL. Additionally, we introduce a Global-Local Alignment Module that mitigates catastrophic forgetting by aligning the predictions of the global model with those of the local and previous models during training. Our extensive experiments demonstrate that integrating SPFL into existing PFL methods significantly improves performance under category and domain shifts.
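The abstract's core idea of combining sequential updates with parallel aggregation can be illustrated with a minimal sketch. This is a hypothetical reading, not the paper's actual algorithm: it assumes clients are split into groups, the model visits each group's clients sequentially (SFL-style), and the resulting group models are averaged (PFL-style). All names (`spfl_round`, `local_update`), the grouping strategy, and the toy 1-D linear model are assumptions for illustration only.

```python
# Hypothetical sketch of a sequential-within-group, parallel-across-group
# update scheme, as suggested by the abstract. Grouping, learning rate,
# and the toy model are illustrative assumptions, not the paper's method.
import random

def local_update(w, client_data, lr=0.1):
    """One SGD pass over a client's samples for a 1-D linear model y = w*x."""
    for x, y in client_data:
        grad = 2 * (w * x - y) * x  # squared-error gradient
        w -= lr * grad
    return w

def spfl_round(global_w, client_datasets, num_groups=2, seed=0):
    """Sequential updates inside each group; average the group models."""
    rng = random.Random(seed)
    order = list(range(len(client_datasets)))
    rng.shuffle(order)  # randomized order eases order sensitivity
    groups = [order[g::num_groups] for g in range(num_groups)]
    group_models = []
    for group in groups:            # groups could run in parallel
        w = global_w
        for cid in group:           # sequential traversal within a group
            w = local_update(w, client_datasets[cid])
        group_models.append(w)
    return sum(group_models) / len(group_models)  # parallel aggregation

# Toy run: four clients whose single samples all lie on y = 3x.
clients = [[(1.0, 3.0)], [(2.0, 6.0)], [(0.5, 1.5)], [(1.5, 4.5)]]
w = 0.0
for _ in range(50):
    w = spfl_round(w, clients)
print(round(w, 2))  # → 3.0
```

With only one group this degenerates to pure SFL, and with one client per group to pure PFL, which is why the scheme can interpolate between the two regimes.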
Supplementary Material: zip
Primary Area: Other (please use sparingly, only use the keyword field for more details)
Flagged For Ethics Review: true
Submission Number: 16927