Keywords: continual learning, online learning, self-supervised learning, representation learning
TL;DR: We identify a novel collapse phenomenon in Online Continual SSL, namely Latent Rehearsal Decay, caused by a lack of plasticity; we address it with SOLAR, a latent-aware strategy that explicitly enforces quality in representation space.
Abstract: Continual learning methods enable models to learn from non-stationary data without forgetting. We study Online Continual Self-Supervised Learning (OCSSL), in which models learn from a continuous stream of unlabeled data. We find that OCSSL exhibits surprising learning dynamics, favoring plasticity over stability, with a simple FIFO buffer outperforming reservoir sampling. We explain this result with the Latent Rehearsal Decay hypothesis, which attributes it to latent-space degradation under excessively stable replay. To quantify this effect, we introduce two metrics (Overlap and Deviation) and show that they correlate with declines in probing accuracy. Building on these insights, we propose SOLAR, which leverages efficient online proxies of Deviation to guide buffer management and incorporates an explicit Overlap loss. Experiments demonstrate that SOLAR achieves state-of-the-art performance on OCSSL vision benchmarks, highlighting its effectiveness in balancing convergence speed and final performance.
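To make the contrast between the two buffer-management strategies in the abstract concrete, here is a minimal, illustrative sketch of FIFO and reservoir-sampling replay buffers; the class names and `add` interface are assumptions for illustration, not the paper's implementation. A FIFO buffer keeps only the most recent samples (favoring plasticity), while reservoir sampling maintains a uniform random subset of the entire stream (favoring stability).

```python
import random


class FIFOBuffer:
    """Keeps only the most recent `capacity` samples from the stream (plasticity-biased)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []

    def add(self, x):
        self.data.append(x)
        if len(self.data) > self.capacity:
            self.data.pop(0)  # evict the oldest sample


class ReservoirBuffer:
    """Keeps a uniform random subset of the whole stream seen so far (stability-biased)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.seen = 0

    def add(self, x):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(x)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = x  # older samples are replaced with decreasing probability
```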
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 16757