Spatial–Frequency Residual-Guided Dynamic Perceptual Network for Remote Sensing Image Haze Removal

Hang Sun, Zhaoru Yao, Bo Du, Jun Wan, Dong Ren, Lyuyang Tong

Published: 01 Jan 2025, Last Modified: 05 Nov 2025, IEEE Transactions on Geoscience and Remote Sensing, License: CC BY-SA 4.0
Abstract: Deep neural networks have recently been explored extensively for remote sensing image haze removal and have achieved remarkable performance. However, most existing haze removal methods fail to effectively fuse spatial and frequency information, which is crucial for learning more representative features. Moreover, the perceptual loss commonly used to train dehazing models ignores the diversity among perceptual channels, leading to performance degradation. To address these issues, we propose a spatial-frequency residual-guided dynamic perceptual network (SFRDP-Net) for remote sensing image haze removal. Specifically, we first propose a residual-guided spatial-frequency interaction (RSFI) module, which incorporates a bidirectional residual complementary mechanism (BRCM) and a frequency residual-enhanced attention (FREA). Both BRCM and FREA exploit spatial-frequency complementarity to guide a more effective fusion of spatial and frequency information, enhancing feature representation capability and improving haze removal performance. Furthermore, a dynamic channel weighting perceptual loss (DCWP-Loss) is developed to impose constraints of varying strength on different perceptual channels, advancing the reconstruction of high-quality haze-free images. Experiments on challenging benchmark datasets demonstrate that SFRDP-Net outperforms several state-of-the-art haze removal methods. The code is publicly available at https://github.com/789as-syl/SFRDP-Net.
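To make the two key ideas concrete, below is a minimal PyTorch sketch, not the authors' implementation (see the repository above for the real code). `SpatialFrequencyBlock` is a hypothetical stand-in for the RSFI module's residual-guided spatial-frequency fusion, and `dcwp_loss` illustrates one plausible reading of a dynamic channel-weighted perceptual loss, where per-channel weights are derived from per-channel feature errors rather than a uniform channel average.

```python
# Illustrative sketch only; all names here are hypothetical stand-ins.
import torch
import torch.nn as nn

class SpatialFrequencyBlock(nn.Module):
    """Toy spatial-frequency interaction: a spatial conv branch and an
    FFT-based frequency branch exchange learned residual corrections
    (a stand-in for BRCM-style guidance) before a 1x1 fusion."""
    def __init__(self, channels: int):
        super().__init__()
        self.spatial = nn.Conv2d(channels, channels, 3, padding=1)
        self.freq = nn.Conv2d(2 * channels, 2 * channels, 1)  # on stacked real/imag
        self.gate_s = nn.Conv2d(channels, channels, 1)        # residual to spatial branch
        self.gate_f = nn.Conv2d(channels, channels, 1)        # residual to frequency branch
        self.fuse = nn.Conv2d(2 * channels, channels, 1)

    def forward(self, x):
        s = self.spatial(x)
        # Frequency branch: rFFT -> 1x1 conv on real/imag parts -> inverse rFFT.
        spec = torch.fft.rfft2(x, norm="ortho")
        z = self.freq(torch.cat([spec.real, spec.imag], dim=1))
        re, im = z.chunk(2, dim=1)
        f = torch.fft.irfft2(torch.complex(re, im), s=x.shape[-2:], norm="ortho")
        # Bidirectional residual guidance: each branch is corrected by a
        # learned transform of the other branch's residual.
        s2 = s + self.gate_s(f - s)
        f2 = f + self.gate_f(s - f)
        return x + self.fuse(torch.cat([s2, f2], dim=1))

def dcwp_loss(pred_feats, gt_feats):
    """Toy dynamic channel-weighted perceptual loss: channels whose
    predicted and ground-truth features differ more receive larger
    weights, instead of the uniform average of a standard perceptual loss."""
    loss = 0.0
    for p, g in zip(pred_feats, gt_feats):        # features from e.g. a VGG extractor
        err = (p - g).abs().mean(dim=(2, 3))      # per-channel L1 error, shape (B, C)
        w = torch.softmax(err.detach(), dim=1)    # dynamic per-channel weights
        loss = loss + (w * err).sum(dim=1).mean()
    return loss
```

In training, `pred_feats` and `gt_feats` would be lists of feature maps extracted from the dehazed output and the haze-free target by a frozen pretrained backbone; the softmax weighting is one assumed realization of "constraints of varying strength on different perceptual channels".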