Keywords: dynamic label structure, foundation models, incremental learning
Abstract: Humans experience the world as a series of connected events, which can be organized hierarchically based on their conceptual knowledge. Drawing from this cognitive insight, we explore how our natural ability to organize and relate information can improve the training of deep learning models. Our approach directly addresses the challenge of catastrophic forgetting by *leveraging the relationships within continuously emerging class data*. In particular, by constructing a tree structure over an expanding set of labels, we uncover fresh perspectives on the relationships within the data, pinpointing groups of similar classes that are easily confused. Additionally, we probe the hidden connections between classes by analyzing the behavior of the original pretrained model via an optimal transport-based approach. Building on these insights, we propose a novel regularization loss that encourages models to focus on challenging areas of knowledge, effectively boosting performance. Our experiments demonstrate the effectiveness of our approach across a range of continual learning benchmarks, paving the way for more effective AI systems.
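The abstract does not spell out how the optimal transport-based analysis works; a minimal sketch of one plausible instantiation is below. It assumes classes are summarized by prototype features (e.g. mean pretrained embeddings per class, a common choice but an assumption here) and uses entropic-regularized Sinkhorn iterations to compute a transport plan between old and new classes; large plan entries would flag easily confused class pairs. All names and parameters are illustrative, not the paper's actual method.

```python
import numpy as np

def sinkhorn(cost, reg=0.1, n_iters=200):
    """Entropic-regularized OT plan between uniform marginals (Sinkhorn iterations)."""
    cost = cost / cost.max()          # normalize so exp(-cost/reg) does not underflow
    K = np.exp(-cost / reg)
    a = np.full(cost.shape[0], 1.0 / cost.shape[0])  # uniform mass over old classes
    b = np.full(cost.shape[1], 1.0 / cost.shape[1])  # uniform mass over new classes
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]  # transport plan, rows: old, cols: new

# Hypothetical class prototypes (e.g. mean pretrained embedding per class).
rng = np.random.default_rng(0)
old_protos = rng.normal(size=(4, 8))   # 4 previously seen classes
new_protos = rng.normal(size=(3, 8))   # 3 newly arriving classes

# Cost = squared Euclidean distance between prototypes.
cost = ((old_protos[:, None, :] - new_protos[None, :, :]) ** 2).sum(-1)
plan = sinkhorn(cost)

# Large entries of `plan` mark old/new class pairs likely to be confused,
# which could then be upweighted by a regularization loss.
print(plan.shape)  # (4, 3)
```

In this sketch, the plan's row and column marginals match the uniform class distributions, so mass concentrates on the cheapest (most similar) prototype pairs; a regularizer could penalize confusions along exactly those pairs.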
Supplementary Material: pdf
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 4040