Non-Local Similarity-Based Attentive Graph Convolution Network for Remote Sensing Image Super-Resolution

Published: 01 Jan 2024, Last Modified: 07 Nov 2024, IEEE Trans. Geosci. Remote Sens. 2024, CC BY-SA 4.0
Abstract: Single-image super-resolution (SISR) for acquiring high-resolution (HR) remote sensing images (RSIs) is becoming increasingly valuable, and convolutional neural networks (CNNs) have driven considerable progress in this field. In RSIs, many similar geo-objects recur within the same scene and occupy the same positions in both the low-resolution (LR) and HR versions of the image. Based on this observation, these similar geo-objects can be exploited to reconstruct texture details in the LR image, since the non-local relationships among them are consistent between LR and HR, thereby improving SISR quality. We therefore propose a novel graph convolutional network (GCN) for SISR that includes a dynamic graph attention mechanism to learn the in-scale and cross-scale non-local features of RSIs. Within a single scale, we propose a dynamic graph attention block (DGAB) that adaptively selects non-local patches based on scene correlations derived from the RSI and fuses patch-wise non-local information weighted by attention scores computed from the topological relationships and radiometric characteristics of the RSI. Across scales, we introduce a dynamic graph attention mixing block (DGAMB) that upsamples LR non-local information to HR non-local information. Most SISR methods place the upsampling block at the end of the network, neglecting feature extraction in the high-dimensional space; to address this, the DGAMB is designed as an upsampler located in the middle of the model, strengthening the extraction of high-dimensional features. Experiments on the WHU Building and UC Merced datasets show that the proposed method outperforms state-of-the-art methods. Our code is available at https://github.com/WenjuanZhang-aircas/NSGCN.
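
As a rough illustration of the in-scale non-local idea, the sketch below implements generic top-k patch-similarity graph attention in PyTorch: each feature patch is linked to its most similar non-local patches, and the neighbors are fused with softmax attention weights. This is only a minimal sketch under the assumption that the DGAB's in-scale step resembles standard non-local graph attention over feature patches; the function name and parameters (patch_size, top_k) are illustrative and not the authors' implementation, which is available at the repository linked above.

```python
import torch
import torch.nn.functional as F

def nonlocal_graph_attention(feat, patch_size=3, top_k=8):
    """Fuse each patch with its top-k most similar non-local patches.

    feat: (B, C, H, W) feature map; returns a tensor of the same shape.
    Hypothetical sketch of top-k non-local graph attention, not the
    paper's DGAB.
    """
    B, C, H, W = feat.shape
    # Unfold into overlapping patches: (B, C*ps*ps, L), L patch locations.
    patches = F.unfold(feat, kernel_size=patch_size, padding=patch_size // 2)
    patches = patches.transpose(1, 2)                    # (B, L, C*ps*ps)

    # Cosine similarity between every pair of patches (the graph edges).
    normed = F.normalize(patches, dim=-1)
    sim = normed @ normed.transpose(1, 2)                # (B, L, L)

    # Dynamic graph: keep only the top-k most similar neighbors per node.
    topv, topi = sim.topk(top_k, dim=-1)                 # (B, L, k)
    attn = topv.softmax(dim=-1)                          # attention scores

    # Gather the neighbor patches and fuse them with the attention weights.
    idx = topi.unsqueeze(-1).expand(-1, -1, -1, patches.size(-1))
    neighbors = patches.unsqueeze(1).expand(-1, patches.size(1), -1, -1)
    neighbors = neighbors.gather(2, idx)                 # (B, L, k, C*ps*ps)
    fused = (attn.unsqueeze(-1) * neighbors).sum(dim=2)  # (B, L, C*ps*ps)

    # Fold the fused patches back to a feature map, averaging overlaps.
    fused = fused.transpose(1, 2)
    out = F.fold(fused, (H, W), kernel_size=patch_size,
                 padding=patch_size // 2)
    ones = F.fold(torch.ones_like(fused), (H, W), kernel_size=patch_size,
                  padding=patch_size // 2)
    return out / ones

x = torch.randn(2, 16, 32, 32)   # toy feature map
y = nonlocal_graph_attention(x)  # (2, 16, 32, 32)
```

Note that the pairwise patch similarity is O(L²) in the number of patch locations L, which is why non-local blocks of this kind are typically applied to small feature maps or restricted search windows.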