Toward Real-World Remote Sensing Image Super-Resolution: A New Benchmark and an Efficient Model

Published: 01 Jan 2025 · Last Modified: 15 May 2025 · IEEE Trans. Geosci. Remote Sens. 2025 · License: CC BY-SA 4.0
Abstract: Super-resolution (SR) is a fundamental and crucial task in remote sensing. It enhances low-resolution (LR) remote sensing images and can benefit downstream tasks such as remote sensing object detection and recognition. Existing remote sensing image SR (RSISR) methods are trained on simulated paired datasets, in which LR images are obtained from their high-resolution (HR) counterparts by a simple, uniform (i.e., bicubic) degradation. Because this simulated degradation usually deviates from the real degradation, the performance of the trained model is limited when applied to real scenarios. To address this issue, we construct a novel real-world RSISR (RRSISR) dataset that models real-world degradation by exploiting the imaging characteristics of the spectral camera to capture paired LR-HR images of the same scene. To ensure precise alignment of the paired images, we apply image registration and geometric correction algorithms. In addition, since the RSISR task involves large volumes of data and demands high efficiency, we divide each image into patches of varying restoration difficulty and propose a reference table-based patch exiting (RPE) method to reduce the computation of SR. Specifically, the method uses a predictor to estimate the performance of the current layer and a lookup table to decide whether to exit. Extensive experiments show that models trained on the proposed RRSISR dataset produce more realistic images than models trained on simulated datasets and generalize well to other satellites. We also demonstrate the efficiency of our RPE method.
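
The patch-exiting mechanism described in the abstract can be illustrated with a short sketch. The following Python pseudocode is a minimal, hypothetical reading of RPE, not the authors' implementation: the names rpe_forward, sr_layers, predictor, exit_table, and the score-quantization scheme are all assumptions introduced here for illustration.

    # Minimal, hypothetical sketch of reference table-based patch exiting (RPE).
    # All names below (sr_layers, predictor, exit_table, n_bins) are illustrative
    # assumptions, not the paper's actual implementation.

    def quantize(score, n_bins=10):
        """Map a predicted quality score in [0, 1] to a discrete table bin."""
        return min(int(score * n_bins), n_bins - 1)

    def rpe_forward(patch, sr_layers, predictor, exit_table):
        """Run one image patch through successive SR layers, exiting early
        when the lookup table indicates further layers are unlikely to help."""
        feat = patch
        for idx, layer in enumerate(sr_layers):
            feat = layer(feat)                            # refine the patch
            score = predictor(feat)                       # estimate current quality
            if exit_table.get((idx, quantize(score)), False):
                break                                     # easy patch: stop early
        return feat

Under this reading, easy patches (e.g., smooth water or vegetation) would exit after a few layers while highly textured patches traverse the full network; one plausible way to build the table offline is to record, per layer index and score bin, whether deeper layers measurably improved reconstruction quality.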