Memory efficient data-free distillation for continual learning

Published: 01 Jan 2023 · Last Modified: 04 Nov 2024 · Pattern Recognit. 2023 · CC BY-SA 4.0
Abstract: Highlights
- We focus on the setting in which the training data of previous tasks are unavailable.
- We propose a novel memory-efficient data-free distillation method.
- Our method encodes knowledge of previous datasets into parameters for distillation (see the sketch below for the general idea).
- Our method shows superiority on multiple continual learning benchmark datasets.
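The highlights describe the mechanism only at a high level. As a rough illustration of what data-free distillation in continual learning typically looks like, the sketch below trains a small generator to synthesize pseudo-samples that a frozen teacher (the model from previous tasks) classifies confidently, then distills the teacher's soft predictions into the student alongside the new task's supervised loss. This is not the paper's actual method: the architectures, dimensions, losses, and loss weighting are all hypothetical placeholders, loosely following common data-free knowledge-distillation recipes.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# All dimensions and architectures are placeholders for illustration.
INPUT_DIM, HIDDEN, NUM_CLASSES, NOISE_DIM = 32, 64, 10, 16

def mlp(in_dim, out_dim):
    return nn.Sequential(nn.Linear(in_dim, HIDDEN), nn.ReLU(),
                         nn.Linear(HIDDEN, out_dim))

teacher = mlp(INPUT_DIM, NUM_CLASSES)   # frozen copy trained on previous tasks
student = mlp(INPUT_DIM, NUM_CLASSES)   # model continuing onto the new task
generator = mlp(NOISE_DIM, INPUT_DIM)   # compact stand-in for stored old data

teacher.eval()
for p in teacher.parameters():
    p.requires_grad_(False)

gen_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
stu_opt = torch.optim.Adam(student.parameters(), lr=1e-3)

new_x = torch.randn(128, INPUT_DIM)                 # placeholder new-task batch
new_y = torch.randint(0, NUM_CLASSES, (128,))

for step in range(200):
    # 1) Fit the generator so the frozen teacher is confident on its outputs,
    #    a common data-free proxy for "resembles the old training data".
    z = torch.randn(64, NOISE_DIM)
    fake_x = generator(z)
    t_logits = teacher(fake_x)
    gen_loss = F.cross_entropy(t_logits, t_logits.argmax(dim=1))
    gen_opt.zero_grad()
    gen_loss.backward()
    gen_opt.step()

    # 2) Distill old-task knowledge into the student on synthetic samples,
    #    while the cross-entropy term learns the new task from real labels.
    with torch.no_grad():
        fake_x = generator(torch.randn(64, NOISE_DIM))
        soft_targets = F.softmax(teacher(fake_x), dim=1)
    kd_loss = F.kl_div(F.log_softmax(student(fake_x), dim=1),
                       soft_targets, reduction="batchmean")
    ce_loss = F.cross_entropy(student(new_x), new_y)
    loss = ce_loss + kd_loss
    stu_opt.zero_grad()
    loss.backward()
    stu_opt.step()
```

The key property this sketch shares with the setting in the highlights is that only parameters (teacher weights plus a lightweight generator) carry knowledge of previous tasks; no examples from earlier datasets are stored or replayed.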