A Novel Multibranch Self-Distillation Framework for Optimizing Remote Sensing Change Detection

Ziyuan Liu, Jiawei Zhang, Wenyu Wang, Yuantao Gu

Published: 01 Jan 2025 · Last Modified: 06 Nov 2025
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
License: CC BY-SA 4.0
Abstract: Deep learning (DL) has achieved remarkable success in change detection (CD) for remote sensing images. Existing training methods for DL-based CD models are predominantly single-stage, single-stream, and end-to-end. Despite the numerous optimization techniques proposed, such as advanced network architectures, loss functions, and hyperparameter tuning, these methods still struggle to achieve consistent and satisfactory detection results across images with varying change area ratios (CARs). This raises a critical question: is the current training paradigm truly optimal? To address it, we propose a novel multibranch self-distillation (MBSD) training framework. In this framework, different partition branches learn detection patterns under diverse CAR scenarios and guide the main branch through distillation. Our approach consistently enhances the detection accuracy of CD models across various change regions without introducing additional time or computational costs during the inference phase. Extensive experiments on the JL1-CD, SYSU-CD, and CDD datasets demonstrate that the MBSD framework consistently improves the performance of CD models with diverse network architectures and parameter sizes, achieving new state-of-the-art results.
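The core training objective described in the abstract, supervised branches whose soft predictions additionally distill into a main branch, can be sketched as follows. This is a minimal illustration under assumed conventions, not the authors' implementation: the function names (`bce`, `kl_soft`, `mbsd_loss`), the choice of binary cross-entropy as the supervised loss, the pixel-wise KL divergence as the distillation term, and the weighting `alpha` are all hypothetical.

```python
# Hypothetical sketch of multibranch self-distillation for binary change maps.
# Each partition branch outputs its own change-probability map; the main
# branch is trained with the ground-truth label plus a soft distillation
# term from the branches. All names and weightings here are illustrative.
import numpy as np

def bce(pred, target, eps=1e-7):
    """Pixel-wise binary cross-entropy, averaged over the map."""
    pred = np.clip(pred, eps, 1 - eps)
    return float(-np.mean(target * np.log(pred)
                          + (1 - target) * np.log(1 - pred)))

def kl_soft(teacher, student, eps=1e-7):
    """Pixel-wise KL(teacher || student) for binary probability maps."""
    t = np.clip(teacher, eps, 1 - eps)
    s = np.clip(student, eps, 1 - eps)
    return float(np.mean(t * np.log(t / s)
                         + (1 - t) * np.log((1 - t) / (1 - s))))

def mbsd_loss(main_pred, branch_preds, label, alpha=0.5):
    """Supervised loss on all branches + distillation from branches to main."""
    sup = bce(main_pred, label) + sum(bce(p, label) for p in branch_preds)
    distill = sum(kl_soft(p, main_pred) for p in branch_preds) / len(branch_preds)
    return sup + alpha * distill
```

Because the partition branches enter only the training loss, they can be discarded at inference, which is consistent with the abstract's claim that the framework adds no time or computational cost at test time.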