Traceable Federated Continual Learning

Published: 01 Jan 2024 · Last Modified: 19 Feb 2025 · CVPR 2024 · CC BY-SA 4.0
Abstract: Federated continual learning (FCL) is a typical mechanism for achieving collaborative model training among clients that own dynamic data. While traditional FCL methods have proven effective, they do not consider task repeatability and fail to achieve good performance in this practical scenario. In this paper, we propose a new paradigm, namely Traceable Federated Continual Learning (TFCL), which aims to cope with repetitive tasks by tracing and augmenting them. Following the new paradigm, we develop TagFed, a framework that enables accurate and effective Tracing, augmentation, and Federation for TFCL. The key idea is to decompose the whole model into a series of marked sub-models for optimizing each client task, before conducting group-wise knowledge aggregation, such that the repetitive tasks can be located precisely and federated selectively for improved performance. Extensive experiments on our constructed benchmark demonstrate the effectiveness and efficiency of the proposed framework. We will release our code at: https://github.com/POwerWeirdo/TagFCL.
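The sketch below illustrates the general idea stated in the abstract: each client keeps small sub-models tagged by task, and the server aggregates parameters group-wise per tag, so a task that repeats across clients is federated only with its own group. This is a minimal illustration under simplifying assumptions; the names (`make_submodel`, `group_aggregate`, the `task_*` tags) are hypothetical and are not the authors' actual API or method.

```python
# Minimal sketch of tag-based, group-wise sub-model aggregation.
# Assumptions: one toy linear sub-model per task per client; aggregation is a
# plain parameter average within each task tag. Not the authors' implementation.
from collections import defaultdict

import torch
import torch.nn as nn


def make_submodel() -> nn.Module:
    # Stand-in sub-model; the paper decomposes a full model into marked parts.
    return nn.Linear(16, 4)


def group_aggregate(client_submodels: dict) -> dict:
    """Average sub-model parameters group-wise by task tag.

    client_submodels maps client_id -> {task_tag: sub-model}.
    Returns one aggregated sub-model per task tag, so repetitive tasks
    (the same tag appearing at several clients) are federated selectively.
    """
    groups = defaultdict(list)
    for submodels in client_submodels.values():
        for tag, model in submodels.items():
            groups[tag].append(model)

    aggregated = {}
    for tag, models in groups.items():
        merged = make_submodel()
        with torch.no_grad():
            for name, param in merged.named_parameters():
                stacked = torch.stack(
                    [dict(m.named_parameters())[name] for m in models]
                )
                param.copy_(stacked.mean(dim=0))
        aggregated[tag] = merged
    return aggregated


if __name__ == "__main__":
    clients = {
        "client_a": {"task_1": make_submodel(), "task_2": make_submodel()},
        "client_b": {"task_1": make_submodel()},  # task_1 repeats across clients
    }
    global_submodels = group_aggregate(clients)
    print(sorted(global_submodels))  # ['task_1', 'task_2']
```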