Abstract: Accurate cloud detection is essential for subsequent optical remote sensing imagery processing. Although many deep learning (DL)-based cloud detection methods have been proposed, accurate detection of thin clouds remains a challenge. To address this issue, this article introduces a thin cloud-aware network (TANet). TANet tackles the problem from the aspects of color, texture, spatial distribution, and feature representation, constructing dedicated strategies to enhance the network's sensitivity to thin cloud regions and thereby improve overall cloud detection accuracy. On the one hand, TANet utilizes a color prior guidance module (CPGM) to incorporate robust dehazing priors, guiding the network to pay more attention to thin cloud areas. On the other hand, a global information aggregation module (GIAM) is employed to extract long-range dependencies between pixels, mining potential correlations between thin and thick clouds and addressing the difficulty of recognizing thin clouds from a purely local perspective. In addition, we construct a plug-and-play cloud feature difference (CFD) loss, which encourages the network to learn more distinctive features between pixels of thin clouds and cloud-free regions, strengthening the network's ability to distinguish highly similar samples of different classes. The experimental results substantiate that our proposed method attains the lowest omission error and the highest detection accuracy, affirming the superior capability of TANet in thin cloud detection and yielding more dependable cloud detection results.
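To make the intuition behind a feature-difference objective concrete, the sketch below shows a generic contrastive-style loss that pulls per-class features toward their class mean while pushing the thin-cloud and cloud-free class means at least a margin apart. This is a minimal NumPy illustration under assumed inputs (arrays of per-pixel feature vectors and a hypothetical `margin` parameter), not the paper's actual CFD loss formulation.

```python
import numpy as np

def feature_difference_loss(cloud_feats, clear_feats, margin=1.0):
    """Illustrative contrastive-style loss (NOT the paper's CFD loss).

    cloud_feats : (N, D) features of thin-cloud pixels
    clear_feats : (M, D) features of cloud-free pixels
    Encourages intra-class compactness and a margin of separation
    between the two class means.
    """
    mu_cloud = cloud_feats.mean(axis=0)
    mu_clear = clear_feats.mean(axis=0)
    # Intra-class term: mean squared distance of features to their class mean.
    intra = (((cloud_feats - mu_cloud) ** 2).sum(axis=1).mean()
             + ((clear_feats - mu_clear) ** 2).sum(axis=1).mean())
    # Inter-class term: hinge penalty if class means are closer than `margin`.
    inter = np.linalg.norm(mu_cloud - mu_clear)
    return intra + max(0.0, margin - inter) ** 2

# Identical classes incur the full margin penalty; well-separated,
# compact classes incur (near-)zero loss.
overlapping = feature_difference_loss(np.zeros((4, 2)), np.zeros((4, 2)))
separated = feature_difference_loss(np.zeros((4, 2)), 2.0 * np.ones((4, 2)))
```

In this toy setting, `overlapping` evaluates to the full margin penalty while `separated` is zero, mirroring how such a loss rewards discriminative features for visually similar thin-cloud and cloud-free pixels.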