Multiscale Bilateral Attention Fusion Network for Pansharpening

Published: 01 Jan 2024 · Last Modified: 14 May 2025 · IEEE Trans. Artif. Intell. 2024 · CC BY-SA 4.0
Abstract: High-resolution multispectral (HRMS) images combine the spatial information of panchromatic (PAN) images with the spectral information of low-resolution multispectral (LRMS) images. Pansharpening is an effective and widely used way to obtain HRMS images. However, most pansharpening approaches match the resolution of the LRMS image to that of the PAN image through direct interpolation, which may introduce artifacts and distort the colors of the fused results. To address this issue, we propose MSBANet, an unsupervised progressive pansharpening framework that adopts a multistage fusion strategy. Each stage contains an attention interactive extraction module (AIEM) and a multiscale bilateral fusion module (MBFM). The AIEM extracts spatial and spectral features from the input images and captures the correlations between them. The MBFM efficiently integrates the information from the AIEM and improves the context awareness of MSBANet. We also design a hybrid loss function that strengthens the ability of the fusion network to preserve spectral and texture details. In qualitative and quantitative experiments on four datasets, MSBANet outperformed state-of-the-art pansharpening techniques. The code will be released.
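
Since the code has not yet been released, the following is only a minimal PyTorch-style sketch of the progressive multistage idea described in the abstract: an upsampled LRMS estimate is refined stage by stage, where each stage pairs a feature-extraction module (AIEM) with a fusion module (MBFM). The internals of AIEM, MBFM, and the hybrid loss are not specified by the abstract; the placeholder implementations below are assumptions for illustration only.

```python
# Illustrative sketch only; module internals are placeholders, not the authors' design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AIEM(nn.Module):
    """Placeholder attention interactive extraction module: extracts spatial (PAN)
    and spectral (MS) features and reweights them with a simple channel attention."""
    def __init__(self, ms_bands, channels=32):
        super().__init__()
        self.pan_conv = nn.Conv2d(1, channels, 3, padding=1)
        self.ms_conv = nn.Conv2d(ms_bands, channels, 3, padding=1)
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, 2 * channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, pan, ms_up):
        feats = torch.cat([self.pan_conv(pan), self.ms_conv(ms_up)], dim=1)
        return feats * self.attn(feats)


class MBFM(nn.Module):
    """Placeholder multiscale bilateral fusion module: fuses AIEM features at two
    scales and predicts a residual correction for the current MS estimate."""
    def __init__(self, ms_bands, channels=32):
        super().__init__()
        self.down = nn.Conv2d(2 * channels, 2 * channels, 3, stride=2, padding=1)
        self.fuse = nn.Conv2d(4 * channels, channels, 3, padding=1)
        self.out = nn.Conv2d(channels, ms_bands, 3, padding=1)

    def forward(self, feats):
        low = F.interpolate(self.down(feats), size=feats.shape[-2:],
                            mode="bilinear", align_corners=False)
        fused = F.relu(self.fuse(torch.cat([feats, low], dim=1)))
        return self.out(fused)


class MSBANetSketch(nn.Module):
    """Progressive pansharpening: each stage refines the current HRMS estimate."""
    def __init__(self, ms_bands=4, stages=3, channels=32):
        super().__init__()
        self.stages = nn.ModuleList(
            [nn.ModuleDict({"aiem": AIEM(ms_bands, channels),
                            "mbfm": MBFM(ms_bands, channels)})
             for _ in range(stages)]
        )

    def forward(self, pan, lrms):
        # Upsample the LRMS image to the PAN resolution as the initial estimate.
        hrms = F.interpolate(lrms, size=pan.shape[-2:],
                             mode="bilinear", align_corners=False)
        for stage in self.stages:
            feats = stage["aiem"](pan, hrms)
            hrms = hrms + stage["mbfm"](feats)  # residual refinement per stage
        return hrms


if __name__ == "__main__":
    pan = torch.randn(1, 1, 256, 256)    # panchromatic image
    lrms = torch.randn(1, 4, 64, 64)     # low-resolution multispectral image
    print(MSBANetSketch()(pan, lrms).shape)  # torch.Size([1, 4, 256, 256])
```

In an unsupervised setting, such a network would typically be trained with a hybrid loss that compares the output against the PAN image for spatial fidelity and against the LRMS image (after downsampling) for spectral fidelity; the exact terms used by MSBANet are not given in the abstract.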