Abstract: This paper studies federated class-incremental learning (FCiL), where new classes appear continually at local clients and may even be private to individual clients.
However, existing FCiL methods suffer from spatial-temporal catastrophic forgetting, i.e., forgetting previously learned knowledge over time (temporal) and client-specific information held by different clients (spatial).
Additionally, private classes and knowledge heterogeneity among local clients further exacerbate spatial-temporal forgetting, making FCiL challenging to apply.
To address these issues, we propose Federated Class-specific Binary Classifier (FedCBC), an innovative approach to transferring and fusing knowledge across both temporal and spatial perspectives.
FedCBC consists of two novel components: (1) continual personalization, which distills previous knowledge from the global model into multiple local models, and (2) selective knowledge fusion, which strengthens the integration of same-class knowledge from divergent clients and shares private knowledge with other clients.
Extensive experiments using three newly formulated metrics (termed GA, KRS, and KRT) demonstrate the effectiveness of the proposed approach.
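As a rough illustration of component (1), the sketch below shows the kind of global-to-local knowledge distillation that continual personalization describes, distilling a frozen global (teacher) model's outputs into a local (student) model. The KL-divergence loss, temperature, and all function names here are illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """Soft-label KL distillation from a frozen global teacher to a local student.

    Both inputs are raw logits of shape (batch, num_classes).
    """
    # Softened teacher distribution and student log-distribution.
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # KL(teacher || student), scaled by T^2 as in standard distillation.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2

# Example: distill previous-task knowledge from the global model into a local model.
if __name__ == "__main__":
    global_logits = torch.randn(8, 10)                      # global (teacher) outputs
    local_logits = torch.randn(8, 10, requires_grad=True)   # local (student) outputs
    loss = distillation_loss(local_logits, global_logits.detach())
    loss.backward()
```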
Primary Subject Area: [Content] Vision and Language
Secondary Subject Area: [Content] Multimodal Fusion
Relevance To Conference: Our work focuses on continually fusing heterogeneous knowledge from different clients while avoiding spatial-temporal catastrophic forgetting in federated continual learning. This is highly relevant to the conference's theme of multimodal fusion, as this paper focuses on extracting and integrating knowledge from heterogeneous models across different clients.
Supplementary Material: zip
Submission Number: 3672