Abstract: Long-tailed class-incremental learning (LT-CIL) aims to learn new classes continuously from a long-tailed data stream, while simultaneously dealing with challenges such as imbalanced learning of tail classes and catastrophic forgetting. To address these challenges, most existing methods employ a two-stage strategy by initializing model training from scratch with further balanced knowledge-driven calibration. This strategy faces challenges in deriving discriminative features from cold-started backbones for the long-tailed distribution of data, consequently leading to relatively diminished performance. In this paper, with the powerful feature extraction capability of pre-trained foundation models, we have achieved a one-stage approach that delivers superior performance. Specifically, we propose Dynamic Adapter Tuning (DAT), which employs a dynamic adapter cache mechanism to adapt a pre-trained model to learn tasks sequentially. The adapter in the cache is either dynamically selected or created according to task similarity, and further compactified with the new task's adapter to mitigate cross-task and cross-class gaps in LT-CIL, significantly alleviating catastrophic forgetting and imbalanced learning issues, respectively. With extensive experimental validation, our method consistently achieves state-of-the-art performance under the challenging LT-CIL setting.
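The abstract only sketches the cache mechanism at a high level. As a hedged illustration (not the paper's actual implementation), the select-or-create step of an adapter cache driven by task similarity might look roughly like the sketch below; the names `Adapter`, `AdapterCache`, `sim_threshold`, and the cosine-similarity criterion over task prototypes are all assumptions for illustration.

```python
# Hypothetical sketch of a dynamic adapter cache: adapters are keyed by a task
# prototype; a new task either reuses the most similar cached adapter or creates
# a fresh one when no cached prototype is similar enough. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Adapter(nn.Module):
    """Bottleneck adapter applied on top of frozen pre-trained features."""
    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual adaptation of the frozen backbone feature.
        return x + self.up(F.relu(self.down(x)))


class AdapterCache:
    """Selects a cached adapter for a new task by prototype similarity, else creates one."""
    def __init__(self, dim: int, sim_threshold: float = 0.8):
        self.dim = dim
        self.sim_threshold = sim_threshold
        self.prototypes: list[torch.Tensor] = []   # mean feature per cached task
        self.adapters: list[Adapter] = []

    def select_or_create(self, task_features: torch.Tensor) -> Adapter:
        proto = task_features.mean(dim=0)          # task prototype from frozen features
        if self.prototypes:
            sims = torch.stack(
                [F.cosine_similarity(proto, p, dim=0) for p in self.prototypes]
            )
            best = int(sims.argmax())
            if sims[best] >= self.sim_threshold:   # similar task: reuse its adapter
                return self.adapters[best]
        adapter = Adapter(self.dim)                # dissimilar task: create a new adapter
        self.prototypes.append(proto.detach())
        self.adapters.append(adapter)
        return adapter


# Usage: features would come from a frozen pre-trained backbone on the current task's data.
cache = AdapterCache(dim=768)
task_feats = torch.randn(128, 768)                # placeholder for backbone features
adapter = cache.select_or_create(task_feats)
adapted = adapter(task_feats)
```

The subsequent "compactification" of the selected adapter with the new task's adapter, which the abstract credits with closing cross-task and cross-class gaps, is not specified here and is left out of the sketch.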
External IDs: dblp:conf/wacv/GuYYWZGD25