Addressing Catastrophic Forgetting by Modulating Global Batch Normalization Statistics for Medical Domain Expansion

Published: 2024 · Last Modified: 08 Dec 2025 · AIPAD/PILM@MICCAI 2024 · CC BY-SA 4.0
Abstract: Model brittleness across datasets is a key concern when deploying deep learning models in real-world medical settings. One approach is to fine-tune the model on subsequent datasets after training on the original dataset. However, this degrades model performance on the original dataset, a phenomenon known as catastrophic forgetting. We develop an approach to address catastrophic forgetting by combining elastic weight consolidation with a simple yet novel modulation of global batch normalization statistics under two domain-expansion scenarios: 1) across imaging systems and 2) across hospital institutions. Focusing on the clinical use case of mammographic breast density detection, we show that our approach empirically outperforms several state-of-the-art approaches, and we provide theoretical justification for the efficacy of batch normalization modulation, demonstrating the potential of our approach for deploying clinical deep learning models that require domain expansion.
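The abstract does not specify the exact modulation rule, but one common way to adjust global batch normalization statistics between domains is a convex combination of the per-channel running means and variances estimated on each domain. The sketch below illustrates this idea in plain Python; the blending weight `alpha` and the function names are assumptions for illustration, not the paper's method.

```python
def blend_bn_stats(mean_a, var_a, mean_b, var_b, alpha):
    """Interpolate per-channel BN running statistics between two domains.

    alpha = 1.0 keeps the original-domain statistics; alpha = 0.0 uses
    only the new-domain statistics. (Hypothetical rule for illustration.)
    """
    mean = [alpha * ma + (1.0 - alpha) * mb for ma, mb in zip(mean_a, mean_b)]
    var = [alpha * va + (1.0 - alpha) * vb for va, vb in zip(var_a, var_b)]
    return mean, var


def batch_norm_infer(x, mean, var, gamma, beta, eps=1e-5):
    """Inference-time batch normalization: normalize each channel of x
    with the supplied (possibly blended) statistics, then scale and shift."""
    return [
        [g * (v - m) / (vr + eps) ** 0.5 + b for v in channel]
        for channel, m, vr, g, b in zip(x, mean, var, gamma, beta)
    ]


# Usage: blend statistics from an "original" and a "new" domain half-and-half,
# then normalize a small two-channel batch with identity scale/shift.
mean, var = blend_bn_stats([0.0, 1.0], [1.0, 4.0], [2.0, 3.0], [1.0, 2.0], alpha=0.5)
out = batch_norm_infer([[1.0, 1.0], [2.0, 2.0]], mean, var, [1.0, 1.0], [0.0, 0.0])
```

Because only the BN running statistics change, the learned weights (which elastic weight consolidation protects) are untouched, which is why the two mechanisms compose naturally.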