Combining a Synergetic Approach with Multi-Scale Feature Fusion for Boosting Abdominal Multi-Organ and Pan-Cancer Segmentation

09 Sept 2023 (modified: 22 Dec 2023) · Submitted to FLARE 2023
Keywords: Deep learning, Abdominal organ segmentation, Feature fusion, Tumor segmentation
TL;DR: Our two-stage framework enhances abdominal image segmentation: a coarse network efficiently localizes targets in the first stage, and a multi-scale feature fusion module improves accuracy in the second, addressing computational and boundary challenges.
Abstract: Because abdominal images accurately capture the spatial distribution and size relationships of lesions in the body, precise segmentation of these images can significantly assist doctors in diagnosis. To address issues such as high computational resource consumption and inaccurate boundary delineation, we propose a two-stage segmentation framework with multi-scale feature fusion, which aims to enhance segmentation accuracy while reducing computational complexity. In the first stage, a coarse segmentation network locates the segmentation targets with minimal computational overhead. In the second stage, we introduce a multi-scale feature fusion module with cross-layer connections, which enhances the network's context awareness and improves its ability to capture the boundary information of intricate medical structures. Our method achieves average Dice Similarity Coefficient (DSC) scores of 85.60\% for organs and 37.26\% for lesions on the validation set, with an average running time of 11 seconds and an area under the GPU memory-time curve of 24,858.1 megabytes, demonstrating the effectiveness and efficiency of our approach in both accuracy and resource utilization.
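For readers who want a concrete picture of the second-stage fusion idea, the sketch below shows one plausible form of a multi-scale feature fusion module with cross-layer connections in PyTorch. The class name `MultiScaleFusion`, the 3D decoder setting, and all layer choices are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of multi-scale feature fusion with cross-layer
# connections for a 3D segmentation decoder; shapes and layers are
# illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleFusion(nn.Module):
    """Fuses decoder features from several resolutions into one scale."""

    def __init__(self, in_channels, out_channels):
        super().__init__()
        # One 1x1x1 projection per input scale to align channel counts.
        self.projections = nn.ModuleList(
            [nn.Conv3d(c, out_channels, kernel_size=1) for c in in_channels]
        )
        # 3x3x3 convolution to blend the summed multi-scale features.
        self.fuse = nn.Sequential(
            nn.Conv3d(out_channels, out_channels, kernel_size=3, padding=1),
            nn.InstanceNorm3d(out_channels),
            nn.LeakyReLU(inplace=True),
        )

    def forward(self, features, target_size):
        # Cross-layer connections: resample every feature map to the target
        # resolution, project to a common width, sum, then fuse.
        fused = 0
        for proj, feat in zip(self.projections, features):
            feat = F.interpolate(
                feat, size=target_size, mode="trilinear", align_corners=False
            )
            fused = fused + proj(feat)
        return self.fuse(fused)


if __name__ == "__main__":
    # Toy example: three decoder scales of a 3D segmentation network.
    feats = [
        torch.randn(1, 32, 32, 32, 32),
        torch.randn(1, 64, 16, 16, 16),
        torch.randn(1, 128, 8, 8, 8),
    ]
    fusion = MultiScaleFusion(in_channels=[32, 64, 128], out_channels=32)
    out = fusion(feats, target_size=(32, 32, 32))
    print(out.shape)  # torch.Size([1, 32, 32, 32, 32])
```

Under these assumptions, the fused map feeds the fine-stage segmentation head at the chosen target resolution; summation keeps the memory footprint lower than concatenation, which matches the paper's emphasis on limited GPU resources.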
Supplementary Material: zip
Submission Number: 11