Abstract: Unsupervised hyperspectral change detection (UHCD), which detects subtle changes between bi-temporal images without manual annotations, is an essential but challenging task in the earth observation community. The prevailing approach performs detection by feature comparison, which is limited by variations in imaging conditions. We observe that fully supervised paradigms using limited annotations can overcome this challenge. Motivated by this, we introduce a novel Observational Learning Paradigm (OraL) for UHCD that mimics fully supervised paradigms. OraL comprises two sequential stages. In the Observation stage, a spatial-temporal observation strategy (STO) records the learning consistency of pixels under different training steps and views to obtain reliable pseudo-labels. In the Reproduction stage, the model is retrained with these pseudo-labels, and a distribution-aware spectral learning strategy (DSL) adaptively increases their learning difficulty according to spectral distributions, enhancing the robustness and generalization of the model. Extensive experiments on several public hyperspectral image datasets demonstrate state-of-the-art performance and pluggability into previous unsupervised methods. The code is available at: https://github.com/GC-WSL/OraL.
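The core idea of the Observation stage, selecting pseudo-labels from pixels whose predictions stay consistent across training steps, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual STO implementation: the function name `select_pseudo_labels`, the consistency measure (per-pixel agreement rate over recorded predictions), and the `keep_ratio` threshold are all assumptions introduced here for illustration.

```python
import numpy as np

def select_pseudo_labels(pred_history, keep_ratio=0.5):
    """Select reliable pseudo-labels from per-pixel binary change
    predictions recorded at several training steps and/or views.

    pred_history: array of shape (T, H, W) with predictions in {0, 1}
                  from T observation points.
    Returns a label map of shape (H, W); -1 marks unlabeled pixels.
    """
    T, H, W = pred_history.shape
    # Majority vote per pixel and the fraction of observations agreeing.
    mean_pred = pred_history.mean(axis=0)                 # in [0, 1]
    majority = (mean_pred >= 0.5).astype(np.int64)
    consistency = np.maximum(mean_pred, 1.0 - mean_pred)  # agreement rate

    # Keep only the most consistently predicted pixels as pseudo-labels;
    # the rest are left unlabeled for the Reproduction stage to ignore.
    threshold = np.quantile(consistency, 1.0 - keep_ratio)
    labels = np.full((H, W), -1, dtype=np.int64)
    mask = consistency >= threshold
    labels[mask] = majority[mask]
    return labels
```

Pixels whose predictions flip between observation points are treated as unreliable and excluded, so the retraining stage only sees labels the model agreed on consistently.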