3SD-Net: SAR Small Ship Detection Neural Network

Published: 01 Jan 2024, Last Modified: 07 Apr 2025 · IEEE Trans. Geosci. Remote Sens. 2024 · CC BY-SA 4.0
Abstract: This article studies a practically meaningful ship detection problem from synthetic aperture radar (SAR) images by neural network. We broadly extract different types of SAR image features and raise the intriguing question of whether these extracted features are beneficial to: 1) suppress data variations (e.g., complex land-sea backgrounds, scattered noise) of real-world SAR images and 2) enhance the features of ships, which are small objects with varying aspect (length-width) ratios, thereby improving ship detection. To answer this question, we propose an SAR ship detection neural network (3SD-Net for short) that builds on CenterNet by newly developing bidirectional coordinate attention (BCA), multiresolution feature fusion (MRF), and a center point distribution module (CPDM). In detail, we first develop BCA to make 3SD-Net focus on ship features as much as possible while ignoring background noise. Second, we leverage MRF to enhance the spatial information of small-scale ships and to address the nontrivial problem that small-scale, shallow-layer details are easily lost after deep convolution in SAR images. Moreover, considering the varying length-width ratios of ships, we study the probability distribution around the ship center and concentrate on enhancing the distribution function of the ship center, thereby significantly improving the performance of the basic CenterNet detector without incurring additional computational or time costs. The experimental results obtained on the public SAR-Ship and SSDD datasets demonstrate the superior performance of our method compared with its competitors. Specifically, our 3SD-Net achieves average precision (AP) values of 91.66% and 90.22% on the two datasets, respectively, outperforming YOLOv7 (90.31% and 87.32%) and EfficientViT (90.06% and 90.08%). Source code will be released upon publication.
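To make the center-distribution idea more concrete, the sketch below generates a CenterNet-style target heatmap in which the Gaussian spread around the ship center follows the box's length-width ratio. This is only an illustrative assumption of how an aspect-ratio-aware center distribution could be built; the function name `center_heatmap`, the scaling factor `alpha`, and the sigma formulas are hypothetical and are not taken from the paper's actual CPDM.

```python
import numpy as np

def center_heatmap(hm_h, hm_w, cx, cy, box_w, box_h, alpha=0.54):
    """Minimal sketch of a CenterNet-style center heatmap target.

    An anisotropic Gaussian is placed at the ship center (cx, cy); its
    standard deviations follow the box width/height so that ships with
    different length-width ratios get differently shaped distributions.
    The scaling factor `alpha` is illustrative, not from the paper.
    """
    ys, xs = np.mgrid[0:hm_h, 0:hm_w].astype(np.float32)
    sigma_x = max(alpha * box_w / 6.0, 1e-3)   # spread along width
    sigma_y = max(alpha * box_h / 6.0, 1e-3)   # spread along height
    heat = np.exp(-((xs - cx) ** 2 / (2 * sigma_x ** 2)
                    + (ys - cy) ** 2 / (2 * sigma_y ** 2)))
    heat[heat < np.finfo(np.float32).eps] = 0.0  # clip negligible values
    return heat  # shape (hm_h, hm_w), peak value 1.0 at the center

# Usage: a 128x128 heatmap for a long, narrow ship centered at (40, 60)
hm = center_heatmap(128, 128, cx=40, cy=60, box_w=30, box_h=8)
```

Stretching the Gaussian with the box dimensions is one way to let elongated ships keep a high-confidence region along their full length, which is the kind of effect the abstract attributes to CPDM.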