EWC-Guided Diffusion Replay for Exemplar-Free Continual Learning

ICLR 2026 Conference Submission 20727 Authors

19 Sept 2025 (modified: 08 Oct 2025), ICLR 2026 Conference Submission, CC BY 4.0
Keywords: Continual Learning, Diffusion Models, Elastic Weight Consolidation, Fisher-Scheduled Replay, Generative Replay, Exemplar-Free Learning, Medical Imaging, Catastrophic Forgetting, Theoretical Forgetting Bound
TL;DR: We propose EWC-guided diffusion replay, combining high-fidelity exemplar-free diffusion replay with Fisher-scheduled allocation and a theoretical forgetting bound for continual learning in medical imaging.
Abstract: Continual learning for medical imaging must adapt to new tasks while preserving prior competence and avoiding retention of patient examples. We present EWC-guided Diffusion Replay, a hybrid framework that combines a single class-conditional diffusion model for exemplar-free replay with Elastic Weight Consolidation for parameter anchoring. To target replay where it is most needed, we introduce Fisher-Scheduled Replay, which allocates synthetic samples per class using a mixture of Fisher saliency and recent loss drift. We further provide a concise decomposition of forgetting that links retention to the divergence between real and replayed data and to Fisher-weighted parameter drift, clarifying how replay fidelity and synaptic stability interact. In class-incremental settings without task identities and without exemplars, the method attains competitive accuracy and lower forgetting on two- and three-dimensional MedMNIST v2 and on CheXpert, outperforming strong regularisation and replay baselines under a matched memory budget. The unified conditional generator is used only during training, which reduces reliance on stored data while remaining architecture-agnostic.
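The following is a minimal sketch of the two mechanisms named in the abstract: a Fisher-weighted EWC penalty and a per-class replay allocation mixing Fisher saliency with recent loss drift. It is based only on the abstract's description; the function names, the mixing rule (a convex combination controlled by `alpha`), and the normalisation choices are assumptions, not the authors' exact formulation.

```python
# Hypothetical sketch, not the authors' implementation.
import torch

def ewc_penalty(model, fisher, anchor_params, lam=1.0):
    """EWC: Fisher-weighted quadratic drift from parameters anchored after the previous task."""
    penalty = 0.0
    for name, p in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (p - anchor_params[name]).pow(2)).sum()
    return lam * penalty

def fisher_scheduled_allocation(class_fisher, class_loss_drift, budget, alpha=0.5):
    """Split the replay budget across old classes (assumed allocation rule).

    class_fisher:     tensor [C] of per-class Fisher saliency
    class_loss_drift: tensor [C] of recent loss increase on replayed samples of each class
    budget:           total number of synthetic samples to request this step
    alpha:            assumed mixing weight between saliency and drift
    """
    f = class_fisher / (class_fisher.sum() + 1e-8)
    d = class_loss_drift.clamp(min=0)
    d = d / (d.sum() + 1e-8)
    scores = alpha * f + (1 - alpha) * d
    # Number of samples to draw from the class-conditional diffusion model per class.
    return torch.round(budget * scores / scores.sum()).long()
```

In this reading, classes that are either highly salient under the Fisher information or currently drifting in loss receive a larger share of the synthetic replay budget, while the EWC term anchors parameters that were important for earlier tasks.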
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 20727