Look Back to Move Forward: Delay-Aware Instance Selection for Online Continual Learning

ICLR 2026 Conference Submission17061 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Online continual learning, delayed labels, efficient training, data streams
TL;DR: We study continual learning with delayed labels and propose a delay-aware instance selection that speeds recovery after shifts while cutting training updates.
Abstract: Supervised continual learning (CL) typically assumes that labels are available immediately after each input arrives. This is unrealistic in many streaming applications, where annotation latency is the norm. When labels arrive late, supervision for past tasks can spill into later tasks, entangling training signals and degrading current performance. We study this delayed-label setting and analyze how different delay regimes impact online CL. We then introduce a delay-aware instance selection strategy that prioritizes which late-labeled examples to use for updates based on a simple model-utility criterion. By selecting only the most beneficial delayed instances, our approach accelerates performance recovery after task shifts and reduces the training budget when labels from multiple past tasks arrive simultaneously. Our contributions are: (i) a clear problem formulation and evaluation protocol for online continual learning with delayed labels; (ii) an empirical analysis across delay regimes showing how label latency mixes supervision across tasks; and (iii) a plug-and-play instance-selection method compatible with replay-based CL. Experiments indicate consistent improvements in current-task accuracy and stability, with fewer update steps than delay-agnostic baselines.
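The abstract does not specify the model-utility criterion, so the following is only a minimal sketch of what delay-aware instance selection could look like. The function name `select_delayed_instances`, the loss-times-delay-discount utility, and the `delay_decay` parameter are all assumptions for illustration, not the authors' method.

```python
import numpy as np

def select_delayed_instances(losses, delays, k, delay_decay=0.1):
    """Rank delayed-labeled examples and keep the top-k for a training update.

    Hypothetical utility: the model's per-sample loss (a proxy for how
    informative the example still is) discounted exponentially by label
    delay, so staler supervision is down-weighted.
    """
    losses = np.asarray(losses, dtype=float)
    delays = np.asarray(delays, dtype=float)
    utility = losses * np.exp(-delay_decay * delays)
    order = np.argsort(-utility)  # indices sorted by descending utility
    return order[:k]

# Example: three late-labeled examples with losses and delays (in steps).
# The high-loss, zero-delay example and the moderately delayed one win.
selected = select_delayed_instances(
    losses=[2.0, 0.1, 1.5], delays=[0, 5, 1], k=2
)
print(selected.tolist())  # → [0, 2]
```

Under this sketch, the selected subset would then feed a standard replay-based update step, in line with the plug-and-play claim in the abstract.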
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 17061