Multipatch Progressive Pansharpening With Knowledge Distillation

IEEE Trans. Geosci. Remote Sens., 2023 (modified: 26 Apr 2023)
Abstract: In this article, we propose a novel multipatch and multistage pansharpening method with knowledge distillation, termed PSDNet. Unlike existing pansharpening methods, which typically feed single-size patches to the network and perform pansharpening in a single overall stage, we design multipatch inputs and a multistage network for more accurate and finer learning. First, multipatch inputs allow the network to learn more accurate spatial and spectral information by reducing the number of object types per patch. We employ small patches in the early part of the network to learn accurate local information, as small patches contain fewer object types; the later part then exploits large patches to refine the result with global information. Second, the multistage network is designed to reduce the difficulty of conventional single-step pansharpening and to progressively generate elaborate results. In addition, instead of the traditional perceptual loss, which is hardly related to the specific task or the designed network, we introduce a distillation loss to reinforce the guidance of the ground truth. Extensive experiments demonstrate the superior performance of our proposed PSDNet over existing state-of-the-art methods. Our code is available at https://github.com/Meiqi-Gong/PSDNet.
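The repository linked above is the authoritative implementation. As a rough illustration only, the idea of supervising each progressive stage with the ground truth (rather than a perceptual loss) can be sketched as below; the function names, the per-stage L1 loss, and the equal weighting are assumptions for this sketch, not the authors' actual formulation.

```python
def stage_loss(pred, gt):
    # mean absolute (L1) error between one stage's flattened output
    # and the ground truth (choice of L1 is an assumption)
    return sum(abs(p - g) for p, g in zip(pred, gt)) / len(gt)

def multistage_loss(stage_outputs, gt, weights=None):
    # supervise every progressive stage with the ground truth so that
    # later stages refine earlier ones; the weighting scheme here is
    # hypothetical, not taken from the paper
    if weights is None:
        weights = [1.0] * len(stage_outputs)
    return sum(w * stage_loss(out, gt)
               for w, out in zip(weights, stage_outputs))

# toy example: three stages whose outputs progressively approach
# a ground-truth image flattened to a list of pixel values
gt = [1.0] * 8
stages = [[0.5] * 8, [0.8] * 8, [0.95] * 8]
total = multistage_loss(stages, gt)
```

In this toy run the per-stage errors shrink (0.5, 0.2, 0.05), reflecting the progressive refinement the abstract describes.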