Online Contrastive Continual Learning with Hard Negative Samples

Published: 01 Jan 2025, Last Modified: 05 Aug 2025. ICASSP 2025. CC BY-SA 4.0.
Abstract: Online continual learning (OCL) is a strict setting of continual learning (CL) in which the agent faces a never-ending data stream and encounters each sample only once. As a result, an OCL agent suffers more severe catastrophic forgetting (i.e., the loss of previously acquired knowledge of old classes) than a standard CL agent. Existing OCL methods leverage contrastive losses to improve the classifier's robustness against forgetting. However, almost all of these methods ignore the role of hard negative samples in such losses; these samples are difficult to distinguish from positives and thereby exacerbate catastrophic forgetting. In this paper, we focus on the classification task in the online contrastive continual learning (OCCL) setting and propose a novel contrastive OCL method, OCCL with Hard Negative Samples (OHNS), which emphasizes the importance of hard negatives. Concretely, OHNS designs an adaptive weight that measures the hardness of each negative sample and incorporates this weight into a new contrastive loss, strengthening discrimination on hard negatives. We conduct extensive experiments on three real-world benchmark datasets, and the results demonstrate the superiority of OHNS over various state-of-the-art OCL methods.
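The abstract's core idea (an adaptive weight that up-weights hard negatives inside a contrastive loss) can be sketched as follows. This is a minimal illustration, not the paper's actual formulation: the function name, the exponential hardness weighting, and the `beta` scale parameter are all assumptions introduced here for clarity.

```python
import numpy as np

def hardness_weighted_contrastive_loss(features, labels, temperature=0.1, beta=1.0):
    """Sketch of a hardness-weighted contrastive loss (not the OHNS formula).

    Negatives more similar to the anchor (i.e. "harder") receive larger
    weights, so the loss pushes harder negatives away more strongly.
    `beta` is a hypothetical hardness-scaling hyperparameter.
    """
    # L2-normalize embeddings so dot products are cosine similarities.
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = feats @ feats.T / temperature  # pairwise similarity logits
    n = len(labels)
    loss, count = 0.0, 0
    for i in range(n):
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        neg = [j for j in range(n) if labels[j] != labels[i]]
        if not pos or not neg:
            continue
        # Adaptive hardness weights: exponential in anchor-negative similarity,
        # normalized so the mean weight over negatives is 1 (preserves scale).
        w = np.exp(beta * sim[i, neg])
        w = w / w.sum() * len(neg)
        neg_term = np.sum(w * np.exp(sim[i, neg]))
        for p in pos:
            # InfoNCE-style log-ratio with hardness-reweighted negatives.
            loss += -np.log(np.exp(sim[i, p]) / (np.exp(sim[i, p]) + neg_term))
            count += 1
    return loss / count
```

With `beta = 0` the weights become uniform and the sketch reduces to a plain supervised contrastive term; increasing `beta` concentrates the denominator's mass on the negatives closest to the anchor.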