Keywords: Continual Learning, Long-Tail Class-incremental Learning, Domain Shift
Abstract: Long-tail class-incremental learning (LTCIL) requires models to sequentially learn new classes from long-tailed data distributions while mitigating catastrophic forgetting. However, existing methods predominantly assume that each incremental task originates from a single domain, neglecting the practical scenario in which tasks span heterogeneous domains. To address this limitation, we introduce \textbf{Cross-Domain LTCIL (CD-LTCIL)}, a challenging and underexplored setting where each task consists of data from multiple domains. We observe that conventional LTCIL methods degrade significantly under cross-domain semantic shift owing to their limited domain-generalization capability. To overcome this challenge, we propose \textbf{C2C} (Contrastive-and-Correlation Catalysts), a parameter-efficient framework that keeps a pre-trained backbone frozen and learns lightweight catalyst pathways. On the first task, C2C combines cosine anchoring with bi-level contrastive learning to establish domain-invariant class representations. On subsequent tasks, it preserves previously acquired knowledge through cross-correlation distillation between a frozen base catalyst and a learnable incremental catalyst. Extensive experiments on standard LTCIL benchmarks (CIFAR-100, ImageNet-R) and the proposed cross-domain Hybrid-DomainNet benchmark show that our approach achieves state-of-the-art performance across all evaluated scenarios, establishing a strong foundation for real-world long-tail continual learning under multi-domain conditions. The code will be made publicly available.
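The abstract does not specify the exact form of the cross-correlation distillation loss; a minimal PyTorch sketch of one plausible reading, assuming a Barlow-Twins-style objective between features of the frozen base catalyst and the learnable incremental catalyst (the function name, `off_diag_weight`, and the per-dimension standardization are hypothetical choices, not the authors' confirmed design):

```python
import torch

def cross_correlation_distillation(z_base: torch.Tensor,
                                    z_inc: torch.Tensor,
                                    off_diag_weight: float = 5e-3) -> torch.Tensor:
    """Hypothetical distillation term.
    z_base: (N, D) features from the frozen base catalyst.
    z_inc:  (N, D) features from the learnable incremental catalyst."""
    z_base = z_base.detach()  # the base catalyst is frozen; no gradient flows into it
    n, _ = z_inc.shape
    # Standardize each feature dimension across the batch.
    z_base = (z_base - z_base.mean(0)) / (z_base.std(0) + 1e-6)
    z_inc = (z_inc - z_inc.mean(0)) / (z_inc.std(0) + 1e-6)
    # Empirical cross-correlation matrix between the two feature spaces: (D, D).
    c = (z_base.T @ z_inc) / n
    # Diagonal entries pulled toward 1 (align new features with old knowledge);
    # off-diagonal entries pushed toward 0 (suppress redundant cross-talk).
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    return on_diag + off_diag_weight * off_diag
```

Under this reading, the loss regularizes the incremental catalyst toward the base catalyst's representation while only lightweight pathway parameters receive gradients, consistent with the parameter-efficient, frozen-backbone setup described above.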
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 13502