Boundary Guided Feature Fusion Network for Camouflaged Object Detection

Published: 01 Jan 2023 · Last Modified: 12 Apr 2025 · PRCV (9) 2023 · License: CC BY-SA 4.0
Abstract: Camouflaged object detection (COD) aims to detect and segment camouflaged objects in their surroundings using algorithmic techniques. The intrinsic similarity between foreground objects and the background environment limits the performance of existing COD methods, making the two difficult to distinguish accurately. To address this issue, we propose a novel Boundary Guided Feature Fusion Network (BGF-Net) for camouflaged object detection. Specifically, we introduce a Contour Guided Module (CGM) that models more explicit contour features to improve COD performance. Additionally, we incorporate a Feature Enhancement Module (FEM) that integrates more discriminative feature representations to enhance detection accuracy and reliability. Finally, we present a Boundary Guided Feature Fusion Module (BGFM) to boost object detection capabilities and produce the camouflaged object predictions. BGFM performs multi-level feature fusion for contextual semantic mining, and the edges extracted by the CGM are incorporated into the fused features to further exploit semantic information related to object boundaries. This approach better integrates contextual information, improving the performance and accuracy of our model. We evaluate BGF-Net through extensive experiments on three challenging datasets.
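To make the boundary-guided fusion idea concrete, the following is a minimal NumPy sketch of one plausible reading of BGFM's behavior as described in the abstract: a high-level feature map is upsampled and fused with a low-level map, and an edge prior (as the CGM would produce) modulates the fused features to emphasize boundary regions. All function names, shapes, and the specific fusion operators (addition, multiplicative edge gating) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def upsample2x(x):
    # Nearest-neighbor 2x upsampling of a (C, H, W) feature map.
    return x.repeat(2, axis=1).repeat(2, axis=2)

def boundary_guided_fusion(feat_low, feat_high, edge_map):
    """Hypothetical boundary-guided feature fusion sketch.

    feat_low:  (C, H, W) low-level features
    feat_high: (C, H/2, W/2) high-level features
    edge_map:  (1, H, W) contour prior in [0, 1], e.g. from a CGM-like module
    """
    # Multi-level fusion: align resolutions, then combine element-wise.
    fused = feat_low + upsample2x(feat_high)
    # Edge guidance: amplify responses near predicted object boundaries.
    return fused * (1.0 + edge_map)
```

In a real network these operations would be learned (e.g. convolutions and bilinear upsampling in PyTorch) rather than fixed arithmetic; the sketch only shows the data flow of fusing multi-level features under an edge prior.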