Keywords: Class Incremental Learning, Self-Supervised Learning
TL;DR: A novel algorithm to address catastrophic forgetting in class incremental learning using label transformation-based self-supervised learning.
Abstract: This paper proposes \textit{Self-Supervised Continual Learning (SCL)} for regularization-based class incremental learning. The novel pretext task in \textit{SCL} uses randomly transformed labels and does not rely on data-augmentation-based transforms. Trained with a novel incremental task regularizer and an orthogonal weight modification backbone, \textit{SCL} shows promising performance on three datasets.
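The abstract describes a pretext task built from randomly transformed labels rather than augmented inputs. A minimal, hypothetical sketch of this idea is given below; the auxiliary head, the random permutation of the label space, and the loss weighting are illustrative assumptions and not the paper's exact formulation.

```python
# Hypothetical sketch of a label-transformation pretext task (not the paper's
# exact method): labels are transformed by a random permutation, so the
# auxiliary objective needs no data augmentation of the inputs.
import torch
import torch.nn as nn

num_classes = 10
backbone = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 256), nn.ReLU())
main_head = nn.Linear(256, num_classes)      # standard classification head
pretext_head = nn.Linear(256, num_classes)   # predicts the transformed label

# Draw a random permutation of the label space; this transforms the labels
# themselves, leaving the input images untouched.
label_perm = torch.randperm(num_classes)

def training_step(x, y, criterion=nn.CrossEntropyLoss(), aux_weight=0.5):
    feats = backbone(x)
    loss_main = criterion(main_head(feats), y)
    # Self-supervised signal: predict the randomly permuted label.
    loss_pretext = criterion(pretext_head(feats), label_perm[y])
    return loss_main + aux_weight * loss_pretext
```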