Multidimensional Remote Sensing Change Detection Based on Siamese Dual-Branch Networks

Published: 01 Jan 2025, Last Modified: 11 Apr 2025, IEEE Trans. Geosci. Remote. Sens. 2025, CC BY-SA 4.0
Abstract: Deep learning models, particularly convolutional neural networks (CNNs), have demonstrated outstanding feature-learning capabilities and achieved remarkable performance in remote sensing change detection (RSCD) tasks. However, their most critical drawback is the lack of effective modeling of global information. This deficiency limits the model’s understanding of the overall context and structure of the image, making it difficult to distinguish background from target areas and leading to the erroneous identification of change regions. In addition, features extracted by traditional backbone networks contain a significant amount of noise, resulting in blurred boundaries of changed objects. Effectively fusing detailed and semantic information to accurately distinguish pseudo changes remains a significant challenge, as does fully exploiting multiscale information. We propose a full-scale multidimensional interaction network, SDSN, which enhances feature representation through complementary detail and semantic branches. First, bi-temporal images are processed by the encoder to extract coarse multiscale features. The semantic branch guides shallow-scale features, while the detail branch focuses on deep-scale features. A multikernel receptive module (MRM) aggregates global information. The detail branch uses a diversity variance module (DVM) and differential operations to generate refined change maps with reduced noise and suppressed background. A multidimensional cross-perception module (MCM) guides the fusion of these change maps, establishing multidimensional dependencies that enrich the feature representation. Compared with previous methods, SDSN achieves superior performance under complex environmental conditions while requiring fewer parameters (4.03 M) and lower computational cost (7.94 G). The code is publicly available at https://github.com/dpt000121/dpt.
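To make the described data flow concrete, below is a minimal PyTorch sketch of a Siamese dual-branch pipeline of this kind: a shared encoder processes both temporal images, a semantic branch with a multikernel module aggregates context, a detail branch suppresses noise before the temporal difference, and a fusion module produces the change map. The internals of the `MRM`, `DVM`, and `MCM` stand-ins here are assumptions for illustration only and are not the authors' implementation (see the linked repository for that).

```python
# Minimal sketch of the dual-branch flow described in the abstract.
# MRM, DVM, and MCM below are simplified stand-ins, not the authors' modules.
import torch
import torch.nn as nn


class MRM(nn.Module):
    """Stand-in multikernel receptive module: parallel convolutions with
    different kernel sizes, fused to aggregate wider context."""
    def __init__(self, channels):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv2d(channels, channels, k, padding=k // 2) for k in (1, 3, 5)]
        )
        self.fuse = nn.Conv2d(3 * channels, channels, 1)

    def forward(self, x):
        return self.fuse(torch.cat([b(x) for b in self.branches], dim=1))


class DVM(nn.Module):
    """Stand-in diversity variance module: re-weights features to suppress
    background noise before the temporal difference is taken."""
    def __init__(self, channels):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.Sigmoid()
        )

    def forward(self, x):
        return x * self.gate(x)


class MCM(nn.Module):
    """Stand-in multidimensional cross-perception module: fuses the semantic
    and detail change maps with a simple channel-attention gate."""
    def __init__(self, channels):
        super().__init__()
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, 2 * channels, 1),
            nn.Sigmoid(),
        )
        self.fuse = nn.Conv2d(2 * channels, channels, 1)

    def forward(self, semantic, detail):
        x = torch.cat([semantic, detail], dim=1)
        return self.fuse(x * self.attn(x))


class SDSNSketch(nn.Module):
    def __init__(self, channels=32):
        super().__init__()
        # Shared (Siamese) encoder applied to both temporal images.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.mrm = MRM(channels)   # semantic branch: context aggregation
        self.dvm = DVM(channels)   # detail branch: noise suppression
        self.mcm = MCM(channels)   # fusion of the two change maps
        self.head = nn.Conv2d(channels, 1, 1)

    def forward(self, t1, t2):
        f1, f2 = self.encoder(t1), self.encoder(t2)
        # Semantic branch: aggregate context, then take the temporal difference.
        semantic_change = torch.abs(self.mrm(f1) - self.mrm(f2))
        # Detail branch: refine features, then take the temporal difference.
        detail_change = torch.abs(self.dvm(f1) - self.dvm(f2))
        # Cross-perception fusion and single-channel change prediction.
        return self.head(self.mcm(semantic_change, detail_change))


if __name__ == "__main__":
    model = SDSNSketch()
    t1 = torch.randn(1, 3, 256, 256)
    t2 = torch.randn(1, 3, 256, 256)
    print(model(t1, t2).shape)  # torch.Size([1, 1, 256, 256])
```

The sketch omits the full multiscale encoder and cross-scale guidance between branches; it only illustrates how the semantic and detail pathways each produce a change map from the shared Siamese features before fusion.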