A Dual-Channel Collaborative Transformer for continual learning

Published: 01 Jan 2025, Last Modified: 17 Apr 2025. Appl. Soft Comput. 2025. License: CC BY-SA 4.0
Abstract: Highlights
• Propose a new, scalable incremental learning framework with minimal parameter growth.
• Design two channels for stability and plasticity, and enable their collaboration.
• Add a distillation loss to enhance knowledge generalization in intermediate layers.
• The proposed method outperforms classical methods across datasets and settings.
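The highlights describe a dual-channel design in which one channel preserves old knowledge (stability) and the other adapts to new tasks (plasticity), with an intermediate-layer distillation loss tying them together. The sketch below is a hypothetical illustration of that general idea, not the authors' implementation: it assumes a PyTorch transformer backbone, a frozen stability channel, a trainable plasticity channel, a learned fusion layer for their collaboration, and an assumed distillation weight of 0.5.

```python
# Hypothetical sketch (not the paper's code): a dual-channel transformer where
# a frozen "stability" channel retains previously learned knowledge and a
# trainable "plasticity" channel adapts to the new task. The channels
# collaborate through a learned fusion of their token features, and an
# intermediate-layer distillation loss pulls the plastic features toward the
# stable ones to limit forgetting.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DualChannelEncoder(nn.Module):
    def __init__(self, d_model=128, nhead=4, num_layers=2, num_classes=10):
        super().__init__()
        make_layer = lambda: nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True
        )
        # Stability channel: frozen, preserves old-task representations.
        self.stable = nn.TransformerEncoder(make_layer(), num_layers=num_layers)
        for p in self.stable.parameters():
            p.requires_grad = False
        # Plasticity channel: trainable, learns the current task.
        self.plastic = nn.TransformerEncoder(make_layer(), num_layers=num_layers)
        # Collaboration: fuse the two channels' features before classification.
        self.fuse = nn.Linear(2 * d_model, d_model)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, x):
        h_stable = self.stable(x)    # (B, T, d_model), no gradient
        h_plastic = self.plastic(x)  # (B, T, d_model), trainable
        fused = self.fuse(torch.cat([h_stable, h_plastic], dim=-1))
        logits = self.head(fused.mean(dim=1))
        # Distillation on intermediate features: keep the plastic channel
        # close to the frozen stable channel.
        distill = F.mse_loss(h_plastic, h_stable.detach())
        return logits, distill


if __name__ == "__main__":
    model = DualChannelEncoder()
    x = torch.randn(8, 16, 128)               # batch of 8 sequences, 16 tokens
    y = torch.randint(0, 10, (8,))
    logits, distill = model(x)
    loss = F.cross_entropy(logits, y) + 0.5 * distill  # 0.5 is an assumed weight
    loss.backward()
    print(loss.item())
```

Because only the plasticity channel, fusion layer, and head receive gradients, parameter growth per new task stays small in this sketch; how the paper actually expands or freezes parameters is not specified in the highlights.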