Self-Supervised Localized Topology Consistency for Noise-Robust Hyperspectral Image Classification

Published: 2025 · Last Modified: 12 Nov 2025 · ICASSP 2025 · CC BY-SA 4.0
Abstract: Label noise in hyperspectral image classification (HIC) can severely degrade model performance by leading to incorrect predictions and overfitting, especially as erroneous labels propagate and compound throughout the training process. To address this, we propose a robust learning framework called Self-Supervised Localized Topology Consistency (SSLTC), which enforces local topology consistency to enhance model resilience against noisy labels. SSLTC captures local topology via a graph-based representation, where nodes represent samples and edges encode pairwise similarities. Predictions are propagated from topologically similar nodes to central nodes, constrained by Kullback-Leibler (KL) divergence to encourage consistent predictions and reduce sensitivity to noisy labels. Additionally, a self-supervised contrastive learning strategy is used to refine spectral-spatial representations in an unsupervised manner, further improving robustness. Extensive experiments on hyperspectral benchmark datasets with varying noise levels demonstrate the superiority of SSLTC in mitigating the adverse effects of label noise compared to state-of-the-art approaches in HIC tasks.
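The abstract describes propagating predictions from topologically similar nodes to a central node and penalizing disagreement with a KL divergence term. As a rough illustration of that idea (not the authors' actual implementation; the function names, the kNN-style weighted adjacency, and all details beyond "KL between a node's prediction and its neighbors' aggregated prediction" are assumptions), a minimal sketch might look like:

```python
import numpy as np

def kl_div(p, q, eps=1e-12):
    # KL(p || q) for discrete probability vectors, clipped for stability
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def topology_consistency_loss(probs, adj):
    """Mean KL divergence between each node's own prediction and the
    similarity-weighted average of its neighbors' predictions.

    probs: (N, C) array of per-sample class probabilities
    adj:   (N, N) array of non-negative pairwise similarity weights
    """
    # Row-normalize the similarity graph so each row forms a distribution
    row_sums = adj.sum(axis=1, keepdims=True)
    weights = adj / np.maximum(row_sums, 1e-12)
    propagated = weights @ probs  # neighbor-aggregated predictions
    losses = [kl_div(propagated[i], probs[i]) for i in range(len(probs))]
    return float(np.mean(losses))
```

When all samples in a neighborhood already agree, the propagated prediction equals each node's own prediction and the loss vanishes; disagreement (e.g. induced by a noisy label) yields a positive penalty, which is the consistency pressure the abstract refers to.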