BISLERi: Ask Your Neural Network Not To Forget In Streaming Learning Scenarios

TMLR Paper52 Authors

18 Apr 2022 (modified: 17 Sept 2024) · Rejected by TMLR · CC BY 4.0
Abstract: This paper introduces a new method for \emph{class-incremental streaming learning}. In streaming learning, a learner encounters a single training example at a time and is constrained to: $(i)$ utilize each sample only once, i.e., single-pass learning, $(ii)$ adapt its parameters immediately, $(iii)$ predict on any new example at any time step without additional computation, i.e., anytime inference, and $(iv)$ minimize the storage cost. Moreover, in the streaming setting, the input data-stream cannot be assumed to be i.i.d.; that is, there may be temporal coherence in the data-stream. Finally, class-incremental learning implies that the learner does not require a task-id at inference. Many existing lifelong learning approaches are either designed to utilize batches of examples, possibly multiple times, or are not optimized for fast updates or anytime inference. The premise of their designs, as well as other aspects (e.g., memory buffer/replay size, the requirement of fine-tuning), renders some existing methods sub-optimal, if not ill-suited, for the streaming learning setup. We propose a streaming Bayesian framework that enables a fast parameter update of the network given a single example and allows it to be evaluated at any time. In addition, we apply an implicit regularizer in the form of snapshot self-distillation to further minimize information loss. The proposed method utilizes a tiny episodic memory buffer with replay to conform to the streaming learning constraints. We also propose efficient online memory replay and buffer replacement policies that significantly boost the model's performance. Extensive experiments and ablations on multiple datasets in different scenarios demonstrate the superior performance of our method over several strong baselines.
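The streaming-learning protocol described in the abstract (one example at a time, single-pass updates, a tiny episodic replay buffer, anytime inference) can be sketched as follows. This is a minimal illustrative sketch, not the paper's method: the per-class-count "model", the reservoir-sampling replacement policy, and all names here are hypothetical placeholders for the paper's Bayesian update and its proposed buffer/replay policies.

```python
import random

class StreamingLearner:
    """Hypothetical sketch of the streaming-learning protocol:
    single-pass, immediate updates, tiny replay buffer, anytime inference."""

    def __init__(self, buffer_size=5, replay_k=2, seed=0):
        self.buffer = []              # tiny episodic memory
        self.buffer_size = buffer_size
        self.replay_k = replay_k      # buffered examples replayed per step
        self.seen = 0                 # stream examples observed so far
        self.rng = random.Random(seed)
        self.class_counts = {}        # stand-in "model": per-class counts

    def _update(self, x, y):
        # Placeholder parameter update (the paper uses a Bayesian update).
        self.class_counts[y] = self.class_counts.get(y, 0) + 1

    def observe(self, x, y):
        """Process one stream example exactly once (single-pass)."""
        self._update(x, y)
        # Replay a few buffered examples to mitigate forgetting;
        # single-pass applies to the stream, replay from memory is allowed.
        k = min(self.replay_k, len(self.buffer))
        for bx, by in self.rng.sample(self.buffer, k):
            self._update(bx, by)
        # Reservoir sampling: a simple replacement policy that keeps the
        # buffer an unbiased sample of the (non-i.i.d.) stream.
        self.seen += 1
        if len(self.buffer) < self.buffer_size:
            self.buffer.append((x, y))
        else:
            j = self.rng.randrange(self.seen)
            if j < self.buffer_size:
                self.buffer[j] = (x, y)

    def predict(self, x):
        """Anytime inference: no extra computation or task-id required."""
        return max(self.class_counts, key=self.class_counts.get)

learner = StreamingLearner()
for t in range(20):
    learner.observe(x=t, y=t % 3)  # temporally coherent toy stream
```

The replacement policy is the interesting design point: with temporally coherent streams, naive FIFO replacement would fill the buffer with the most recent class, while reservoir sampling (used here purely as an illustration) keeps older classes represented.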
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission: We have incorporated the reviewers' suggestions and updated the paper. The updated content is colored in blue.
Assigned Action Editor: ~Stanislaw_Kamil_Jastrzebski1
Submission Number: 52