A Cross-Level Spectral-Spatial Joint Encode Learning Framework for Imbalanced Hyperspectral Image Classification

Published: 01 Jan 2022 · Last Modified: 13 Nov 2024 · IEEE Trans. Geosci. Remote Sens., 2022 · CC BY-SA 4.0
Abstract: Convolutional neural networks (CNNs) have dominated research on hyperspectral image (HSI) classification owing to their superior feature representation capacity. Fast patch-free global learning (FPGA), a fast learning framework for HSI classification, has received wide interest. Despite promising results in terms of fast inference, recent works have difficulty modeling spectral–spatial relationships under imbalanced samples. In this article, we revisit the encoder–decoder-based fully convolutional network (FCN) and propose a cross-level spectral–spatial joint encoding (CLSJE) framework for imbalanced HSI classification. First, a multiscale input encoder and multiple-to-one multiscale feature connections are introduced to obtain abundant features and facilitate the flow of multiscale contextual information between the encoder and the decoder. Second, in the encoder layer, we propose a spectral–spatial joint attention (SSJA) mechanism consisting of high-frequency spatial attention (HFSA) and spectral-transform channel attention (STCA). HFSA and STCA encode spectral–spatial features jointly to improve the learning of discriminative spectral–spatial features. Powered by these two components, CLSJE can capture both spatial and spectral dependencies for HSI classification. In addition, a class-proportion sampling strategy is developed to increase attention to classes with insufficient samples. Extensive experiments demonstrate the superiority of the proposed CLSJE in both classification accuracy and inference speed, achieving state-of-the-art results on four benchmark datasets. Code can be obtained at https://github.com/yudadabing/CLSJE .
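To make the SSJA idea concrete, below is a minimal, self-contained sketch of what such a joint attention block might look like. The module names HFSA and STCA follow the abstract, but the internal design chosen here (a Laplacian high-pass filter driving the spatial gate, and squeeze-and-excitation-style gating over spectral channels) is an assumption made for illustration only, not the authors' released implementation.

```python
# Hedged sketch of a spectral-spatial joint attention (SSJA) block.
# Assumptions: STCA ~ SE-style channel gating over spectral bands,
# HFSA ~ spatial gate computed from a Laplacian high-pass response.
import torch
import torch.nn as nn
import torch.nn.functional as F


class STCA(nn.Module):
    """Spectral-transform channel attention (assumed SE-style gating over bands)."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W); squeeze spatial dims, then gate each spectral channel
        w = self.fc(x.mean(dim=(2, 3)))            # (B, C)
        return x * w.unsqueeze(-1).unsqueeze(-1)   # broadcast over H, W


class HFSA(nn.Module):
    """High-frequency spatial attention (assumed: high-pass filter -> spatial gate)."""
    def __init__(self, channels: int):
        super().__init__()
        lap = torch.tensor([[0., -1., 0.], [-1., 4., -1.], [0., -1., 0.]])
        self.register_buffer("kernel", lap.view(1, 1, 3, 3).repeat(channels, 1, 1, 1))
        self.gate = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Depthwise high-pass filtering emphasizes edges/texture before gating.
        hf = F.conv2d(x, self.kernel, padding=1, groups=x.shape[1])
        attn = torch.sigmoid(self.gate(hf))        # (B, 1, H, W)
        return x * attn


class SSJA(nn.Module):
    """Joint spectral (STCA) and spatial (HFSA) attention on encoder features."""
    def __init__(self, channels: int):
        super().__init__()
        self.stca = STCA(channels)
        self.hfsa = HFSA(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.hfsa(self.stca(x))


if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)             # (batch, channels, height, width)
    print(SSJA(64)(feats).shape)                   # torch.Size([2, 64, 32, 32])
```

The sequential ordering (spectral gating followed by spatial gating) is one plausible composition; the paper may combine the two branches differently, so consult the linked repository for the actual design.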
