Keywords: Federated Learning
Abstract: Federated Class-Incremental Learning (FCIL) aims to continually expand a model's recognition capacity in a distributed environment, enabling it to learn new classes while retaining knowledge of previously seen ones. Exemplar replay has emerged as a promising strategy owing to its simplicity and effectiveness. Existing methods either select exemplars based on local dynamics or construct global feature spaces to identify representative samples; however, they face inherent challenges in striking a balance between effectiveness and privacy. To address this issue, this paper proposes a Cross-views Lewis weIght Fusion method for exemplar replay in FCIL, termed CLIF, which fuses multi-view importance scores to guide representative sample selection under federated settings. Specifically, CLIF consists of two main modules: 1) the cross-view Lewis weight fusion module computes and integrates Lewis weights from multiple feature perspectives to achieve consistent importance estimation, ensuring that the selected samples better reflect the global data distribution and thus enhancing the representativeness of the replay subset. Building on this, 2) the frequency-based weighted training module adjusts the loss contribution of each sample according to its selection frequency across views, emphasizing the contribution of critical samples. Moreover, we provide a theoretical analysis to guarantee the soundness and effectiveness of CLIF. Extensive experiments on three datasets demonstrate that our method consistently improves over baselines by 1%–6%, supporting the above claims.
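The abstract does not disclose CLIF's algorithmic details, so the following is only an illustrative sketch of the two ingredients it names: per-view Lewis weights (computed here via the standard fixed-point iteration; for $\ell_2$ they reduce to leverage scores) and a cross-view fusion that also yields a per-sample selection frequency. The function names, the fusion-by-averaging choice, and the top-$k$ selection rule are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def lewis_weights(A, p=2, n_iter=20):
    """Approximate the l_p Lewis weights of the rows of A using the
    fixed-point iteration w_i <- (a_i^T (A^T W^{1-2/p} A)^{-1} a_i)^{p/2}.
    For p = 2 these coincide with the leverage scores of A."""
    n, d = A.shape
    w = np.ones(n)
    for _ in range(n_iter):
        Wpow = w ** (1.0 - 2.0 / p)          # diagonal reweighting W^{1-2/p}
        M = A.T @ (Wpow[:, None] * A)        # A^T W^{1-2/p} A
        Minv = np.linalg.pinv(M)
        quad = np.einsum('ij,jk,ik->i', A, Minv, A)  # a_i^T M^{-1} a_i
        w = np.clip(quad, 1e-12, None) ** (p / 2.0)
    return w

def fuse_views(views, k):
    """Toy cross-view fusion: average per-view Lewis weights, select the
    top-k samples, and report how often each selected sample also appears
    in the per-view top-k (a proxy for the 'selection frequency' that
    could reweight the training loss)."""
    scores = np.stack([lewis_weights(V) for V in views])  # (n_views, n)
    fused = scores.mean(axis=0)
    selected = np.argsort(-fused)[:k]
    per_view_topk = [set(np.argsort(-s)[:k]) for s in scores]
    freq = np.array([sum(i in tk for tk in per_view_topk) for i in selected])
    return selected, freq / len(views)
```

As a sanity check, for a full-rank matrix the $\ell_2$ Lewis weights (leverage scores) sum to the rank, and the fused selection returns exactly $k$ indices with frequencies in $[0, 1]$.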
Supplementary Material: zip
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 6747