Abstract: Non-exemplar class incremental learning aims to learn both new and old tasks without accessing any training data from the past. This strict restriction increases the difficulty of alleviating catastrophic forgetting, since all techniques can only be applied to current task data. To address this challenge, we propose a novel framework of fine-grained knowledge selection and restoration. Conventional knowledge distillation-based methods place overly strict constraints on the network parameters and features to prevent forgetting, which limits the training of new tasks. To relax this constraint, we propose a novel fine-grained selective patch-level distillation to adaptively balance plasticity and stability: task-agnostic patches can be used to preserve the decision boundary of the old task, while patches containing the important foreground are favorable for learning the new task.
Moreover, we employ a task-agnostic mechanism to generate more realistic prototypes of old tasks from current task samples, reducing classifier bias for fine-grained knowledge restoration. Extensive experiments on CIFAR100, TinyImageNet and ImageNet-Subset demonstrate the effectiveness of our method. Code is available at https://github.com/scok30/vit-cil.