Keywords: Continual Learning, Lifelong Learning, Dendritic Learning, Biologically Plausible
Abstract: Class-incremental learning (CIL) enables models to acquire new knowledge while retaining prior knowledge, thereby adapting to continuous data streams. Because parameter drift and distribution shifts are inevitable, CIL suffers from catastrophic forgetting and the stability–plasticity dilemma. Various strategies address these challenges, yet they remain limited by homogeneous representations, which reduce inter-class diversity and exacerbate forgetting. To overcome this bottleneck, we introduce dendritic learning (DeL), a biologically inspired framework that reduces homogeneous representations and thereby mitigates catastrophic forgetting. DeL leverages synaptic plasticity and multi-branch dendrites to extract diverse, discriminative features, fostering heterogeneous representation learning. A membrane layer integrates these features, and a subsequent somatic layer adapts them for downstream classification. By strengthening class-specific features, DeL also promotes robust memory consolidation. Experiments show that augmenting state-of-the-art CIL methods with DeL consistently boosts accuracy. Furthermore, DeL encourages more efficient representation learning, allowing the model to rely on fewer discriminative features. Code is available at https://github.com/anonymous/DeL.
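The abstract describes a three-stage head: multi-branch dendrites extracting diverse features, a membrane layer integrating them, and a somatic layer producing class predictions. A minimal PyTorch sketch of such a module follows; all class names, layer choices, and dimensions here are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn


class DendriticHead(nn.Module):
    """Hypothetical sketch of a DeL-style head: multi-branch dendrites
    extract heterogeneous features, a membrane layer integrates them,
    and a somatic layer adapts them for classification."""

    def __init__(self, in_dim, branch_dim, num_branches, num_classes):
        super().__init__()
        # Multi-branch dendrites: independent projections with a bounded
        # nonlinearity, encouraging less correlated (heterogeneous) features.
        self.branches = nn.ModuleList(
            nn.Sequential(nn.Linear(in_dim, branch_dim), nn.Tanh())
            for _ in range(num_branches)
        )
        # Membrane layer integrates the concatenated branch outputs.
        self.membrane = nn.Linear(branch_dim * num_branches, branch_dim)
        # Somatic layer maps integrated features to class logits.
        self.soma = nn.Linear(branch_dim, num_classes)

    def forward(self, x):
        feats = torch.cat([branch(x) for branch in self.branches], dim=-1)
        return self.soma(torch.relu(self.membrane(feats)))


# Usage: attach as the classification head of a CIL backbone's features.
head = DendriticHead(in_dim=512, branch_dim=64, num_branches=4, num_classes=10)
logits = head(torch.randn(8, 512))  # → shape (8, 10)
```

In a class-incremental setting, the somatic layer would typically be widened as new classes arrive while the dendritic branches continue to provide a shared, diverse feature basis.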
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 15154