Abstract: Incremental semantic segmentation aims to continually learn to segment newly arriving classes without access to the training data of previously seen classes. However, most current methods fail to tackle catastrophic forgetting and background shift because they 1) treat all previous classes equally, ignoring the different forgetting paces caused by imbalanced gradient back-propagation; and 2) lack strong semantic guidance between classes. In this paper, to address these challenges, we propose a Gradient-Semantic Compensation (GSC) model, which tackles incremental semantic segmentation from both the gradient and semantic perspectives. Specifically, to handle catastrophic forgetting from the gradient aspect, we develop a step-aware gradient compensation that balances the forgetting paces of previously seen classes by re-weighting gradient back-propagation. Meanwhile, we propose a soft-sharp semantic relation distillation that distills consistent inter-class semantic relations via soft labels, alleviating catastrophic forgetting from the semantic aspect. In addition, we design a prototypical pseudo re-labeling scheme that provides strong semantic guidance to mitigate background shift: it produces high-quality pseudo labels for background pixels belonging to previous classes by measuring the distances of pixels to class-wise prototypes. Experiments on three public segmentation datasets provide strong evidence for the effectiveness of our proposed GSC model.
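The prototype-based re-labeling idea above can be illustrated with a minimal sketch: a background pixel is reassigned to a previously seen class when its feature lies close enough to that class's prototype (mean feature). All names, the distance metric, and the threshold `tau` below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def prototype_pseudo_relabel(features, labels, prototypes, bg_class=0, tau=1.0):
    """Sketch of prototypical pseudo re-labeling (hypothetical helper).

    features:   (N, D) array of per-pixel feature vectors
    labels:     (N,) current labels; bg_class marks background pixels
    prototypes: dict {class_id: (D,) mean feature of that previous class}
    tau:        distance threshold (assumed) for a confident assignment
    """
    pseudo = labels.copy()
    bg_indices = np.where(labels == bg_class)[0]
    for i in bg_indices:
        # distance of this background pixel's feature to each class-wise prototype
        dists = {c: np.linalg.norm(features[i] - p) for c, p in prototypes.items()}
        nearest = min(dists, key=dists.get)
        # only relabel when the pixel is close to some previous-class prototype
        if dists[nearest] < tau:
            pseudo[i] = nearest
    return pseudo
```

For example, a background pixel whose feature almost coincides with class 1's prototype is relabeled as class 1, while background pixels far from every prototype keep the background label.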