Saliency-Guided Adaptive Random Diffusion for Remote Sensing Image Restoration under Cloud and Haze

Published: 30 Oct 2025, Last Modified: 12 Nov 2025 · Dublin, Ireland · CC BY 4.0
Abstract: Remote sensing image restoration under cloud and haze occlusion poses a significant challenge due to severe spectral degradation and spatial distortion. While recent generative models have shown promise in image restoration, they struggle with three key issues: (1) a lack of precise annotations, which makes supervised methods unreliable; (2) unintended interference with clear regions, which distorts unaffected areas; and (3) spectral and structural inconsistencies in heavily occluded regions, which limit realistic recovery. To address these challenges, we propose the Saliency-Guided Adaptive Random Diffusion strategy (SG-ARD), a novel blind restoration framework that integrates saliency-aware guidance with adaptive diffusion for enhanced reconstruction. First, we introduce a Saliency-Guided Pseudo-label Generation (SGPG) module to identify degraded regions and generate pseudo-labels for blind restoration. Second, we propose an Adaptive Random Diffusion Correction (ARDC) strategy, which employs random-walk-based diffusion and an adaptive enhancement module to refine the local and global textures of the pseudo-labels. Lastly, we design a Spectral-Aware Consistency (SAC) loss to improve spectral fidelity, ensuring that the generated content aligns with the real spectral distribution. Extensive experiments on three large-scale remote sensing datasets demonstrate that SG-ARD outperforms state-of-the-art generative restoration models, producing high-fidelity, visually coherent remote sensing images.
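The abstract does not spell out how the random-walk-based diffusion inside ARDC selects pixels for correction. As a minimal sketch of the general idea only, the NumPy snippet below builds an irregular binary mask by letting several random walks wander across the image grid; the function name, parameters, and walk policy are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def random_walk_mask(h, w, n_walks=8, steps=500, seed=0):
    """Generate an irregular binary mask via random walks.

    Illustrative assumption only: one plausible way a
    random-walk-based diffusion step could select non-rectangular
    pixel regions for correction.
    """
    rng = np.random.default_rng(seed)
    mask = np.zeros((h, w), dtype=bool)
    moves = np.array([(-1, 0), (1, 0), (0, -1), (0, 1)])
    for _ in range(n_walks):
        # Start each walk at a random pixel.
        y, x = rng.integers(h), rng.integers(w)
        for _ in range(steps):
            mask[y, x] = True
            # Take one step in a random 4-connected direction,
            # clamped to the image bounds.
            dy, dx = moves[rng.integers(4)]
            y = np.clip(y + dy, 0, h - 1)
            x = np.clip(x + dx, 0, w - 1)
    return mask
```

Such a mask could then gate which pixels a diffusion correction pass is allowed to touch, leaving clear regions untouched.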
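Likewise, the exact SAC formulation is not given here. The PyTorch sketch below shows one plausible shape for a spectral-aware consistency term, comparing band-wise means and inter-band correlations between a restored output and a reference; the function and its equal weighting of the two terms are assumptions for illustration, not the paper's loss.

```python
import torch

def spectral_consistency_loss(pred, target, eps=1e-6):
    """Penalize deviation between per-band spectral statistics.

    pred, target: (B, C, H, W) tensors, C = spectral bands.
    Illustrative assumption: compares band-wise means and
    inter-band correlation matrices; not the authors' SAC loss.
    """
    # Band-wise means over the spatial dimensions: (B, C).
    mu_p = pred.mean(dim=(2, 3))
    mu_t = target.mean(dim=(2, 3))

    def band_corr(x):
        # Flatten spatial dims, center, and compute the
        # inter-band correlation matrix: (B, C, C).
        v = x.flatten(2)
        v = v - v.mean(dim=2, keepdim=True)
        cov = v @ v.transpose(1, 2) / (v.shape[2] - 1)
        std = cov.diagonal(dim1=1, dim2=2).clamp_min(eps).sqrt()
        return cov / (std.unsqueeze(2) * std.unsqueeze(1))

    mean_term = torch.mean((mu_p - mu_t) ** 2)
    corr_term = torch.mean((band_corr(pred) - band_corr(target)) ** 2)
    return mean_term + corr_term
```

In training, a term like `spectral_consistency_loss(restored, reference)` would be added to the generator objective to keep generated content close to the real spectral distribution.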