Few-Shot Class-Incremental Learning Based on Hierarchical Dual-Stream Interaction and Associative Memory Fusion
Keywords: Continual Learning, Few-shot Learning, Few-Shot Class-Incremental Learning, Image Classification, Brain-inspired
TL;DR: We propose a cognition-inspired framework with a dual-stream network and associative memory fusion to address intra-class variance collapse and boundary instability in FSCIL, achieving stronger generalization and less catastrophic forgetting.
Abstract: Few-Shot Class-Incremental Learning (FSCIL) aims to learn novel classes from limited examples while preserving previously acquired knowledge. Current methods face two challenges: (1) collapsed intra-class variance, where enhancing base-class separability limits generalization to novel classes; and (2) boundary instability, where the few novel samples distort the feature distribution and cause catastrophic forgetting. To address these challenges, we propose a cognition-inspired framework that employs a dual-stream network to learn a unified representation space with strong generalization, together with a hierarchical fusion mechanism with associative memory that refines the feature distributions of both old and new classes. The framework comprises two key modules for rapid adaptation and long-term stability. The Hierarchical Dual-Stream Interaction Network (HDIN) decouples feature learning into a ResNet-based local stream for fine-grained detail extraction and a ViT-based global stream for capturing long-range semantic dependencies; the two streams are dynamically integrated via channel-adaptive attention to harmonize multi-scale information, simulating cognitive-level feature integration. The Associative-Enhanced Hierarchical Memory Fusion (AE-HMF) module simulates cortical memory consolidation by drawing Gaussian samples around class prototypes to serve as associative memories and by performing cross-layer feature interactions. Experiments on CIFAR100, miniImageNet, and CUB200 show that, without large-scale pretraining or data-expansion techniques, our approach achieves the lowest Performance Decline Rate (DR) on all three benchmarks, delivering a state-of-the-art balance between accuracy and forgetting. This work establishes a unified, cognition-inspired framework that improves generalization and reduces catastrophic forgetting in FSCIL.
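To make the fusion step concrete, below is a minimal PyTorch sketch of channel-adaptive attention over two streams, as described for HDIN. The class name `ChannelAdaptiveFusion`, the bottleneck-MLP gating design, and the pooled feature shapes are our illustrative assumptions; the paper's actual fusion mechanism may differ.

```python
import torch
import torch.nn as nn

class ChannelAdaptiveFusion(nn.Module):
    """Sketch: fuse a CNN (local) feature and a ViT (global) feature by
    predicting per-channel attention weights for each stream."""
    def __init__(self, dim: int, reduction: int = 4):
        super().__init__()
        # Small bottleneck MLP mapping the concatenated features to one
        # gate value per channel per stream (hypothetical design choice).
        self.gate = nn.Sequential(
            nn.Linear(2 * dim, dim // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(dim // reduction, 2 * dim),
        )

    def forward(self, f_local: torch.Tensor, f_global: torch.Tensor) -> torch.Tensor:
        # f_local, f_global: (batch, dim) pooled features from the two streams.
        joint = torch.cat([f_local, f_global], dim=-1)   # (batch, 2*dim)
        w = torch.sigmoid(self.gate(joint))              # per-channel gates in (0, 1)
        w_local, w_global = w.chunk(2, dim=-1)           # (batch, dim) each
        return w_local * f_local + w_global * f_global   # channel-wise weighted fusion
```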
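Likewise, here is a minimal sketch of the associative-memory step attributed to AE-HMF: drawing Gaussian samples around stored class prototypes so that old classes remain represented in feature space. The function name, the per-class standard deviation input, and the sample count are assumptions made for illustration.

```python
import torch

def sample_associative_memories(prototypes: torch.Tensor,
                                class_std: torch.Tensor,
                                n_samples: int = 8) -> torch.Tensor:
    """Sketch: draw pseudo-features around each stored class prototype.

    prototypes: (num_classes, dim) mean feature per previously seen class.
    class_std:  (num_classes, dim) per-class feature standard deviation.
    Returns:    (num_classes, n_samples, dim) replayed features that can
                stand in for old-class data when calibrating boundaries.
    """
    mu = prototypes.unsqueeze(1).expand(-1, n_samples, -1)
    sigma = class_std.unsqueeze(1).expand(-1, n_samples, -1)
    return mu + sigma * torch.randn_like(mu)  # reparameterized Gaussian draw
```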
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 24764