STRobustNet: Efficient Change Detection via Spatial-Temporal Robust Representations in Remote Sensing

Published: 01 Jan 2025, Last Modified: 13 Apr 2025 · IEEE Trans. Geosci. Remote Sens. 2025 · License: CC BY-SA 4.0
Abstract: Vision transformers have achieved impressive performance in addressing spatial–temporal inconsistencies in change detection (CD) due to their ability to model long-range dependencies in bitemporal features. However, applying the self-attention operation directly to bitemporal features causes feature confusion and incurs high computational complexity. In this article, we propose a novel CD framework based on spatial–temporal robust representations (STRobustNet). To avoid applying self-attention directly to bitemporal features, we derive a collection of robust representations for the land-use classes of interest. These representations are used to transform spatial–temporally inconsistent bitemporal features into consistent bitemporal classification representations, enhancing model efficiency and reducing feature confusion. Specifically, to fully perceive and integrate the varied appearances of the same class while minimizing deviation from the current input samples, we design a robust representation generation module (RRGModule), which utilizes the universal spatial–temporal context provided by the entire dataset and the specific spatial–temporal context from the current bitemporal images to generate robust representations for the land-use classes of interest, improving robustness to spatial–temporal inconsistencies. These representations are then used to activate bitemporal features in different class channels, producing spatial–temporally consistent classification representations. CD is ultimately performed on these consistent representations, effectively avoiding false detections caused by spatial–temporal inconsistencies. Experimental results demonstrate that STRobustNet achieves performance comparable to top-performing methods and offers the fastest inference speed among transformer-based methods. Code and pretrained models are available at https://github.com/DLUTTengYH/STRobustNet.
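The core idea — projecting each temporal feature map onto a small set of class representations and detecting change from the resulting per-class activations rather than from raw features — can be illustrated with a minimal NumPy sketch. This is an illustrative simplification, not the authors' implementation: the function names, the plain dot-product activation, and the total-variation change score are all assumptions standing in for the paper's RRGModule and decoder.

```python
import numpy as np

def activate_with_class_reps(feat, reps):
    """Project one temporal feature map onto K class representations.

    feat: (C, H, W) feature map from one time step
    reps: (K, C) robust class representations (hypothetical stand-in
          for the RRGModule output)
    returns: (K, H, W) per-pixel class probabilities
    """
    C, H, W = feat.shape
    flat = feat.reshape(C, H * W)          # (C, HW)
    logits = reps @ flat                   # (K, HW): per-class activation
    # Softmax over the class axis yields a "classification representation"
    e = np.exp(logits - logits.max(axis=0, keepdims=True))
    probs = e / e.sum(axis=0, keepdims=True)
    return probs.reshape(-1, H, W)

def change_map(feat_t1, feat_t2, reps):
    """Score change from disagreement between class activations,
    not from raw bitemporal feature differences."""
    p1 = activate_with_class_reps(feat_t1, reps)
    p2 = activate_with_class_reps(feat_t2, reps)
    # Total-variation distance per pixel, in [0, 1]
    return 0.5 * np.abs(p1 - p2).sum(axis=0)
```

Because both temporal features are mapped into the same K class channels before comparison, appearance variation within a class (illumination, season) is absorbed by the shared representations, which is the intuition behind the consistency claim in the abstract.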