Abstract: Synthetic Aperture Radar (SAR) images contain densely cluttered objects that are better characterized by oriented bounding boxes. However, accurately estimating object angles remains challenging due to the SAR imaging mechanism. To address this issue, we propose a novel knowledge distillation method, cross-modal Gaussian Localization Distillation (GaLD), which improves SAR object detection by transferring angle information from optical images. Specifically, we convert oriented bounding boxes into Gaussian distributions and design a Gaussian Angle Distillation (GAD) loss to align the angle information between optical and SAR images. In addition, to mitigate the negative impact of low-quality angle information on the network, we design an Adaptive Weighting Strategy (AWS) that guides the student network to prioritize high-quality angle information. Progress in this area has also been hindered by the lack of high-quality oriented labels and objects in the OGSOD-1.0 dataset; we therefore augment OGSOD-1.0 with high-quality oriented labels and additional images to construct a new dataset. Extensive experiments demonstrate the effectiveness and superiority of the proposed GaLD over existing methods. The dataset and code are available at: https://github.com/wchao0601/GaLD.
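To make the box-to-Gaussian step concrete, the sketch below shows the standard conversion of an oriented box (cx, cy, w, h, theta) into a 2-D Gaussian, as used by Gaussian-based rotated detection losses, together with a KL-divergence term between a teacher (optical) and a student (SAR) Gaussian as one possible distillation signal. The function names and the choice of KL divergence are illustrative assumptions, not the paper's exact GAD formulation.

```python
# Minimal sketch (not the authors' implementation): oriented box -> 2-D Gaussian,
# plus a KL-divergence term between teacher and student Gaussians.
import numpy as np


def obb_to_gaussian(cx, cy, w, h, theta):
    """Convert an oriented box (cx, cy, w, h, theta in radians) to a 2-D Gaussian (mu, sigma)."""
    mu = np.array([cx, cy], dtype=np.float64)
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])           # rotation by the box angle
    D = np.diag([w * w / 4.0, h * h / 4.0])   # covariance from the box half-extents
    sigma = R @ D @ R.T                       # rotate covariance into the box orientation
    return mu, sigma


def gaussian_kl(mu_s, sigma_s, mu_t, sigma_t):
    """KL(N_s || N_t) between the student (SAR) and teacher (optical) Gaussians."""
    inv_t = np.linalg.inv(sigma_t)
    diff = mu_t - mu_s
    term_trace = np.trace(inv_t @ sigma_s)
    term_maha = diff @ inv_t @ diff
    term_logdet = np.log(np.linalg.det(sigma_t) / np.linalg.det(sigma_s))
    return 0.5 * (term_trace + term_maha - 2.0 + term_logdet)


# Usage: identical boxes give (near-)zero divergence; a 10-degree angle mismatch does not.
mu_t, sig_t = obb_to_gaussian(50.0, 40.0, 30.0, 10.0, np.deg2rad(30.0))
mu_s, sig_s = obb_to_gaussian(50.0, 40.0, 30.0, 10.0, np.deg2rad(40.0))
print(gaussian_kl(mu_s, sig_s, mu_t, sig_t))
```

Because the covariance encodes width, height, and angle jointly, a divergence between the two Gaussians penalizes angle disagreement smoothly and avoids the angular boundary discontinuity of raw angle regression, which is the usual motivation for Gaussian-based formulations.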