CapT: A Hierarchical Capsule Representation Learning Approach for Class Continual Learning

Published: 01 Jan 2025, Last Modified: 20 Sept 2025 · ICASSP 2025 · CC BY-SA 4.0
Abstract: The human brain is adept at continually acquiring new skills through structured patterns without forgetting what it has already learned, a feat that artificial neural networks struggle to emulate. When these networks are exposed to new information, their performance on previously learned tasks tends to decline sharply, a phenomenon known as catastrophic forgetting. We introduce a novel representation-learning approach built on a hierarchical structure. In our method, individual classes are encapsulated and dynamically routed to maintain relationships between similar classes at different levels. The model grows into a tree-like architecture, with each phase adding nodes that contain capsules of semantically related classes. During training, only the nodes representing new classes and their children are updated, while the rest of the tree remains frozen, effectively preserving the representations of older classes.
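The phase-wise freezing described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class and function names (`CapsuleNode`, `add_phase`) and the `trainable` flag are hypothetical, standing in for the actual capsule parameters and routing machinery.

```python
# Hypothetical sketch: the model grows as a tree, and only nodes added in the
# current phase (and their children) remain trainable; older nodes are frozen.

class CapsuleNode:
    def __init__(self, name, classes):
        self.name = name          # node identifier (hypothetical)
        self.classes = classes    # semantically related classes grouped in this node
        self.children = []        # child nodes deeper in the tree
        self.trainable = True     # new nodes start out trainable

    def freeze(self):
        """Freeze this node and its entire subtree."""
        self.trainable = False
        for child in self.children:
            child.freeze()

def add_phase(root, new_node):
    """Freeze the existing tree, then attach the new phase's node."""
    root.freeze()
    root.children.append(new_node)  # new_node and its children stay trainable

# Usage: phase 1 trained the root; phase 2 adds a "vehicles" node.
root = CapsuleNode("root", classes=["cat", "dog"])
vehicles = CapsuleNode("vehicles", classes=["car", "truck"])
add_phase(root, vehicles)

print(root.trainable)      # → False (old representations preserved)
print(vehicles.trainable)  # → True  (only the new node is updated)
```

In a real implementation the `trainable` flag would correspond to setting `requires_grad` on the capsule parameters of each node, so gradients only flow through the newly added subtree.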