Boundary-Aware Feature Fusion With Dual-Stream Attention for Remote Sensing Small Object Detection

Published: 01 Jan 2025 · Last Modified: 12 Apr 2025 · IEEE Trans. Geosci. Remote Sens. 2025 · License: CC BY-SA 4.0
Abstract: Detecting small objects in remote sensing images poses significant challenges to the field of computer vision, primarily stemming from the complexity of backgrounds, limitations in pixel resolution, and information loss during the feature fusion process. While general object detection has significantly advanced in recent years, remote sensing small object detection remains an unsolved problem, with existing frameworks struggling to achieve high performance at small scales. In this article, we propose a novel framework called the boundary-aware feature fusion network (BAFNet), which significantly enhances the model’s ability to represent and locate small objects precisely within complex remote sensing scenarios. First, a dual-stream attention fusion module captures complementary foreground and background cues through bidirectional context modeling. Jointly attending to objects and their surroundings enhances discriminative power for distinguishing small objects. Additionally, we incorporate a boundary-aware branch to better preserve crucial detailed information vital for small-scale objects. This auxiliary component supervises the fusion of contextual semantics and spatial information, aiding in retaining critical boundary details that are prone to loss during cross-layer feature fusion. We conducted experiments on the challenging AI-TOD, VisDrone, DIOR, and LEVIR-Ship datasets. The results demonstrate the superiority of our approach over other state-of-the-art (SOTA) object detection methods, particularly in terms of precisely identifying small objects within remote sensing images. The code is available at https://github.com/ooo1128/BAFNet.
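The abstract's dual-stream idea, attending to foreground and background cues jointly, can be illustrated with a toy sketch. This is a hypothetical, dependency-free illustration of the general principle, not the paper's actual BAFNet implementation: the function name `dual_stream_fuse`, the per-position objectness `scores`, and the additive fusion rule are all assumptions made for the example.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dual_stream_fuse(features, scores):
    """Toy dual-stream attention fusion (hypothetical sketch).

    `scores` are per-position objectness logits. The foreground stream
    attends to high-score positions, the background stream to low-score
    positions (negated logits), and the output re-weights `features`
    by both complementary views.
    """
    fg = softmax(scores)                # foreground attention weights
    bg = softmax([-s for s in scores])  # background attention weights
    # Combine the two attention views additively for each position.
    return [f * (wf + wb) for f, wf, wb in zip(features, fg, bg)]

# Example: a position with a high objectness logit receives more
# foreground weight, while low-logit positions dominate the background view.
feats = [1.0, 2.0, 3.0]
logits = [0.1, 2.0, -1.0]
fused = dual_stream_fuse(feats, logits)
```

In BAFNet itself the two streams operate over 2-D feature maps with learned projections and bidirectional context modeling; this sketch only conveys why modeling both the object and its surroundings sharpens the response on small objects against cluttered backgrounds.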