Abstract: As global climate change worsens, floods are becoming more severe and frequent, urgently demanding effective flood risk mitigation strategies. Timely and precise flood inundation mapping is crucial for emergency response and relief. Track 2 of the 2024 IEEE GRSS Data Fusion Contest aims to pioneer innovative algorithms for accurate flood extraction from multi-source optical remote sensing (RS) data. However, data diversity introduces aleatoric uncertainty, especially with synthetic, non-real data. Meanwhile, the vast coverage of RS imagery and the small proportion of flooded areas cause a severe class imbalance, leading to epistemic uncertainty. In this paper, we propose an Uncertainty-aware Detail-Preserving Network (UADPNet) for rapid flood mapping from multi-source optical data. First, we design an Aleatoric Uncertainty Estimator to model aleatoric uncertainty in multi-source data. Second, we introduce a Multi-Scale Convolution Block that extracts multi-scale information without downsampling. Third, we employ a multi-level supervision strategy to quantify epistemic uncertainty and highlight uncertain pixels via an Uncertainty-Aware Fusion Module. On top of UADPNet, we adopt a multi-model fusion and post-processing strategy to further improve flood extraction accuracy. Experimental results on the official test set demonstrate the effectiveness of our method, which secured first place in Track 2 of the 2024 IEEE GRSS Data Fusion Contest with an F1 score of 89.843%.
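The abstract describes the Multi-Scale Convolution Block only at a high level ("multi-scale information without downsampling"). One common way to enlarge the receptive field at full resolution is dilated convolution applied at several rates; the NumPy sketch below illustrates this idea under that assumption. The function names and the choice of dilation rates are illustrative, not taken from the paper.

```python
import numpy as np

def dilated_conv2d(x, kernel, dilation):
    """Single-channel 2D cross-correlation with 'same' zero padding and a
    given dilation rate; output keeps the input's spatial resolution."""
    k = kernel.shape[0]
    # Effective receptive field of a k x k kernel dilated by `dilation`.
    eff = dilation * (k - 1) + 1
    pad = eff // 2
    xp = np.pad(x, pad)
    out = np.zeros_like(x, dtype=float)
    for i in range(k):
        for j in range(k):
            di, dj = i * dilation, j * dilation
            # Each kernel tap samples the input with a stride of `dilation`.
            out += kernel[i, j] * xp[di:di + x.shape[0], dj:dj + x.shape[1]]
    return out

def multi_scale_block(x, kernel, rates=(1, 2, 4)):
    """Stack responses at several dilation rates: multi-scale context
    is gathered while spatial resolution is preserved (no downsampling)."""
    return np.stack([dilated_conv2d(x, kernel, r) for r in rates])
```

Because every branch keeps the input's height and width, fine flood boundaries are not lost to pooling; the stacked responses can then be fused by a later layer.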