Centripetal Intensive Deep Hashing for Remote Sensing Image Retrieval

Published: 2025, Last Modified: 23 Jan 2026. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., 2025. License: CC BY-SA 4.0.
Abstract: With breakthroughs in convolutional neural networks, deep hashing methods have demonstrated remarkable performance in large-scale image retrieval tasks. However, existing deep supervised hashing methods, which rely on pairwise or triplet labels, typically learn the hash function via random or hardest-sample mining within training batches. This strategy primarily captures local sample similarities, causing a distribution shift and limiting retrieval performance. Furthermore, most methods emphasize global features while overlooking structural information, which is essential for understanding spatial relationships in images. To address these limitations, we propose a Centripetal Intensive Deep Hashing (CIDH) method for remote sensing image retrieval. First, we design a Hybrid-Attention Guided Multiscale Refinement Network that integrates channel and spatial attention to capture multiscale visual features and highlight salient regions at different scales. Next, we introduce a central similarity loss based on class-centered labels to optimize the spatial distribution of global samples, encouraging hash codes with similar semantics to cluster around centroids and reducing distribution shift. Meanwhile, we incorporate a central intensive loss in the Hamming space to shorten intraclass Hamming distances, yielding more compact and discriminative hash codes. Extensive experiments demonstrate the superiority of our CIDH method over current state-of-the-art deep hashing methods.
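To make the central-similarity idea concrete, the following is a minimal NumPy sketch of a loss that pulls each sample's hash code toward a fixed class centroid. It is an illustration only: the Hadamard construction of centers and the binary cross-entropy form are common choices in central-similarity hashing, and the paper's exact centers, loss terms, and network outputs may differ. All names (`hadamard_centers`, `central_similarity_loss`) are hypothetical.

```python
import numpy as np

def hadamard_centers(num_classes, code_len):
    # Rows of a Hadamard matrix give mutually well-separated +/-1 centers,
    # a common construction for hash centroids (assumed here, not taken
    # from the paper).
    H = np.array([[1.0]])
    while H.shape[0] < max(num_classes, code_len):
        H = np.block([[H, H], [H, -H]])  # Sylvester doubling
    return H[:num_classes, :code_len]

def central_similarity_loss(codes, labels, centers):
    # codes: (N, L) real-valued network outputs in (-1, 1), e.g. tanh
    # activations; labels: (N,) integer class ids.
    # Binary cross-entropy that pulls each code toward its class center,
    # so same-class codes cluster around the same centroid.
    t = (centers[labels] + 1) / 2          # target bits in {0, 1}
    p = np.clip((codes + 1) / 2, 1e-7, 1 - 1e-7)  # predicted bit probs
    return float(np.mean(-(t * np.log(p) + (1 - t) * np.log(1 - p))))
```

Codes that already sit near their class centroid incur a near-zero loss, while codes near a different centroid are penalized heavily, which is the clustering behavior the abstract describes.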