Abstract: Remote sensing change detection (RSCD) seeks to identify areas of interest that have changed in spatially co-registered multitemporal remote sensing (RS) images, thereby monitoring land-surface changes. Identifying imbalanced differences between foreground and background categories is crucial when samples are limited and interference is significant. This letter proposes a dynamic interaction and adaptive fusion network (DIAFNet) designed to focus efficiently on changed regions. DIAFNet uses a ResNet18 backbone for feature extraction and incorporates the DIAM module to enable dynamic interaction between features of the dual-temporal images, establishing a global feature distribution for autonomously learning dependencies among diverse features. In addition, GSConv is introduced to maintain channel correlations and thereby enhance feature representation. The MFAF module uses the abstract semantic information of deep features to guide the learning of shallow features, producing more precise edge information and more complete change regions through adaptively weighted feature fusion. Finally, skip connections minimize the loss of fine-grained information. Quantitative evaluations on the change detection dataset (CDD), SYSU-CD, and LEVIR-CD datasets show that our method outperforms other state-of-the-art techniques.
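Since the abstract outlines the overall data flow (Siamese ResNet18 feature extraction, dual-temporal interaction via DIAM, deep-to-shallow adaptive fusion via MFAF, and skip connections), a minimal PyTorch sketch of that flow is given below. The internals of DIAM, GSConv, and MFAF are not specified here, so the `PlaceholderDIAM` and `PlaceholderMFAF` classes and all layer choices are illustrative assumptions, not the authors' implementation.

```python
# Structural sketch of the DIAFNet pipeline described in the abstract.
# DIAM/MFAF internals are not given, so the blocks below are placeholder
# approximations of the stated data flow only.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18


class PlaceholderDIAM(nn.Module):
    """Stand-in for the dynamic interaction module: fuses dual-temporal features."""
    def __init__(self, channels):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, f1, f2):
        return self.fuse(torch.cat([f1, f2], dim=1))


class PlaceholderMFAF(nn.Module):
    """Stand-in for adaptive fusion: deep semantics guide shallow features."""
    def __init__(self, deep_channels, shallow_channels):
        super().__init__()
        self.reduce = nn.Conv2d(deep_channels, shallow_channels, kernel_size=1)
        self.gate = nn.Sequential(
            nn.Conv2d(shallow_channels, shallow_channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, deep, shallow):
        deep = F.interpolate(self.reduce(deep), size=shallow.shape[2:],
                             mode="bilinear", align_corners=False)
        weight = self.gate(deep)                       # adaptive weights from deep semantics
        return weight * shallow + (1 - weight) * deep  # weighted fusion keeps shallow detail


class DIAFNetSketch(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        backbone = resnet18(weights=None)
        # Shared (Siamese) ResNet18 stages applied to both temporal images.
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1,
                                  backbone.relu, backbone.maxpool)
        self.stages = nn.ModuleList([backbone.layer1, backbone.layer2,
                                     backbone.layer3, backbone.layer4])
        chans = [64, 128, 256, 512]
        self.diam = nn.ModuleList([PlaceholderDIAM(c) for c in chans])
        self.mfaf = nn.ModuleList([PlaceholderMFAF(chans[i + 1], chans[i])
                                   for i in range(3)])
        self.head = nn.Conv2d(chans[0], num_classes, kernel_size=1)

    def extract(self, x):
        feats, x = [], self.stem(x)
        for stage in self.stages:
            x = stage(x)
            feats.append(x)
        return feats

    def forward(self, img_t1, img_t2):
        f1, f2 = self.extract(img_t1), self.extract(img_t2)
        fused = [m(a, b) for m, a, b in zip(self.diam, f1, f2)]  # per-scale interaction
        x = fused[-1]
        for i in range(2, -1, -1):                               # deep -> shallow guidance
            x = self.mfaf[i](x, fused[i])
        logits = self.head(x)
        return F.interpolate(logits, scale_factor=4,
                             mode="bilinear", align_corners=False)


if __name__ == "__main__":
    model = DIAFNetSketch()
    t1, t2 = torch.randn(1, 3, 256, 256), torch.randn(1, 3, 256, 256)
    print(model(t1, t2).shape)  # expected: torch.Size([1, 2, 256, 256])
```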