Segment Any Cancer in CT scans by equipping SAM with cross-slice interaction and an indicator prompt
Keywords: Pan-cancer segmentation, SAM, Auto prompt, Foundation model, Cross-slice interaction
Abstract: CT scanning has become a widely used imaging modality in cancer diagnosis, depicting the anatomical structure of the human body in detail. However, different types of cancer manifest very differently in CT imaging, which poses great challenges for pan-cancer segmentation in CT scans. Recently, SAM has become a landmark in medical image segmentation thanks to its strong generality and generalization, offering a new paradigm for universal segmentation. However, current SAM-based segmentation approaches perceive fine details poorly, depend heavily on manual prompts, and lack 3D feature interaction. To address these issues, we develop a universal cancer segmentation model for CT scans built on the segmentation paradigm of SAM. Specifically, a 3D CNN-based U-shaped image encoder and a cross-branch interaction module are introduced to strengthen SAM's detail capture and spatial feature interaction. In addition, a cancer indicator prompt encoder removes the dependence of SAM-based approaches on manual prompts. To combine the advantages of SAM-based universal segmentation models and UNet-based cancer-specific segmentation models, we fuse the predictions of both, further reducing false positives and omissions in pan-cancer segmentation. Moreover, to fully exploit data that is only partially annotated for specific cancers, we combine pseudo labels with partial labels to generate fully annotated data, effectively avoiding label-conflict issues. On the validation set, our method achieved an average lesion DSC of 30% and NSD of 22%, and the average running time and area under the GPU memory-time curve are 18 s and 38960 MB, respectively.
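The label-completion step described in the abstract (filling the unannotated regions of a partially labeled scan with teacher pseudo labels while preserving the manual annotations) can be sketched roughly as below. This is a minimal NumPy illustration under stated assumptions: integer label maps, 0 meaning background/unlabeled, and a per-scan list of the class IDs that were actually annotated by hand; the function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def merge_partial_and_pseudo(partial, pseudo, annotated_classes):
    """Combine a partially annotated mask with teacher pseudo labels.

    Voxels of manually annotated classes are kept from `partial`;
    everywhere else the pseudo label fills the gap. Pseudo predictions
    for the annotated classes that fall outside the manual mask are
    reset to background, so the two sources never conflict.
    """
    merged = pseudo.copy()
    # Suppress pseudo labels for classes the annotator already covered.
    merged[np.isin(merged, annotated_classes)] = 0
    # Re-impose the trusted manual annotations.
    keep = np.isin(partial, annotated_classes)
    merged[keep] = partial[keep]
    return merged

# Toy 1D example: class 1 is manually annotated, class 2 is pseudo-only.
partial = np.array([0, 1, 1, 0, 0])
pseudo = np.array([2, 2, 1, 2, 0])
print(merge_partial_and_pseudo(partial, pseudo, [1]))  # → [2 1 1 2 0]
```

In practice the same merge would run on 3D volumes; the voxel-wise logic is unchanged.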
Submission Number: 5