Streaming LifeLong Learning With Any-Time Inference

Published: 01 Jan 2023, Last Modified: 05 Nov 2024 · ICRA 2023 · CC BY-SA 4.0
Abstract: Despite rapid advances in lifelong learning (LL) research, a large body of work focuses mainly on improving performance in existing static continual learning (CL) setups. These methods lack the ability to succeed in a rapidly changing dynamic environment, where an AI agent must quickly learn new instances in a single pass from non-i.i.d. (and possibly temporally contiguous/coherent) data streams without suffering from catastrophic forgetting. For practical applicability, we propose a novel lifelong learning approach that is streaming, i.e., a single input sample arrives at each time step. Moreover, the proposed approach is single-pass, class-incremental, and can be evaluated at any moment. To address this challenging setup and the associated evaluation protocols, we propose a Bayesian framework that enables fast parameter updates given a single training example and supports any-time inference. We additionally propose an implicit regularizer in the form of snapshot self-distillation, which further mitigates forgetting. We also propose an effective method that efficiently selects a subset of samples for online memory rehearsal, along with a new replay buffer management scheme that significantly boosts overall performance. Our empirical evaluations and ablations demonstrate that the proposed method outperforms prior work by large margins.
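The abstract gives no implementation details, so the following is only a rough, hypothetical sketch of the general setup it describes: one labeled sample per time step, single-pass updates, a bounded replay buffer, and a snapshot-based self-distillation penalty. It is not the paper's Bayesian update, its sample-selection method, or its buffer management scheme; the class name StreamingLearner, all hyperparameters, and the use of plain reservoir sampling for the buffer are assumptions made purely for illustration.

```python
# Hypothetical sketch (not the paper's method): a streaming, single-pass
# learner that sees one labeled sample per time step, keeps a small replay
# buffer via reservoir sampling, and adds a snapshot self-distillation term
# against a periodically frozen copy of itself.
import copy
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class StreamingLearner:
    def __init__(self, model, buffer_size=200, replay_batch=16,
                 distill_weight=0.5, snapshot_every=100, lr=1e-3):
        self.model = model
        self.snapshot = copy.deepcopy(model).eval()   # frozen "past self"
        self.opt = torch.optim.SGD(model.parameters(), lr=lr)
        self.buffer, self.buffer_size = [], buffer_size
        self.replay_batch = replay_batch
        self.distill_weight = distill_weight
        self.snapshot_every = snapshot_every
        self.seen = 0

    def _reservoir_add(self, x, y):
        # Reservoir sampling keeps an approximately uniform subset of the stream.
        self.seen += 1
        if len(self.buffer) < self.buffer_size:
            self.buffer.append((x, y))
        else:
            j = random.randrange(self.seen)
            if j < self.buffer_size:
                self.buffer[j] = (x, y)

    def observe(self, x, y):
        # One gradient step per incoming sample (single-pass, streaming).
        xs, ys = [x], [y]
        if self.buffer:
            k = min(self.replay_batch, len(self.buffer))
            for bx, by in random.sample(self.buffer, k):
                xs.append(bx); ys.append(by)
        xb, yb = torch.stack(xs), torch.stack(ys)

        logits = self.model(xb)
        loss = F.cross_entropy(logits, yb)
        with torch.no_grad():
            old_probs = F.softmax(self.snapshot(xb), dim=1)
        # Snapshot self-distillation: stay close to the model's earlier predictions.
        loss = loss + self.distill_weight * F.kl_div(
            F.log_softmax(logits, dim=1), old_probs, reduction="batchmean")

        self.opt.zero_grad(); loss.backward(); self.opt.step()
        self._reservoir_add(x, y)
        if self.seen % self.snapshot_every == 0:
            self.snapshot = copy.deepcopy(self.model).eval()
        return loss.item()

# Any-time inference: the current model can be queried after every step.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
learner = StreamingLearner(model)
x, y = torch.randn(28, 28), torch.tensor(3)
learner.observe(x, y)
pred = model(x.unsqueeze(0)).argmax(dim=1)
```

In this sketch the frozen snapshot plays the role of an implicit regularizer: the KL term discourages the model from drifting away from its own earlier predictions, while the replayed samples interleave past classes with the incoming one so the model can be evaluated at any point in the stream.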