Spatial Matching Loss Function for Mass Segmentation on Whole Mammography Images

23 Sept 2023 (modified: 27 Feb 2024) · ICLR 2024 Conference Withdrawn Submission
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Mass Segmentation, Mammography, AU-Net, Loss Function, Spatial Matching Loss
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: A new loss-calculation framework for mass segmentation that incorporates higher-level spatial information into the loss.
Abstract: Breast cancer is among the cancers with the highest mortality rates in women, and mammography is one of the primary means of improving its identification. Deep-learning-based approaches are among the leading methods for mass segmentation in mammography images, and in this category of methods the loss function is a core element. Most proposed losses measure pixel-level similarity; while hard-coded location information is present in these losses, they largely neglect higher-level information such as relative distances, sizes, and quantities, which is important for mass segmentation. Motivated by this observation, in this paper we propose a framework for loss calculation in mass segmentation for mammography images that incorporates higher-level spatial information by spatially matching the prediction and ground-truth masks during loss calculation. We term this framework the Spatial Matching (SM) loss. Instead of computing the loss only over the entire masks, which captures the similarity between the segmentation and the ground truth solely at the pixel level, SM loss also compares the two within the cells of a grid, enabling the loss to measure higher-level similarities in locations, sizes, and quantities. The grid size is selected per sample, which allows the method to account for variation in mass sizes. In this study, Binary Cross Entropy (BCE) and Tversky are used as the core losses in the SM loss experiments, with AU-Net as the baseline approach. We evaluated our method on the INbreast dataset; the results show a significant boost in the performance of the baseline method while outperforming state-of-the-art mass segmentation methods.
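The idea described in the abstract can be sketched in a few lines: apply a core pixel-level loss (here BCE) over the whole mask, then apply the same core loss within each cell of a grid partition and mix the two terms. This is a minimal illustrative sketch, not the authors' implementation: the fixed `grid` argument and the mixing weight `alpha` are assumptions, whereas the paper selects the grid size adaptively per sample.

```python
import numpy as np

def bce(pred, target, eps=1e-7):
    # Core pixel-level loss: mean binary cross-entropy over the mask.
    pred = np.clip(pred, eps, 1 - eps)
    return -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))

def sm_loss(pred, target, grid=4, alpha=0.5):
    # Sketch of a Spatial Matching (SM) style loss: combine the global
    # core loss with per-cell core losses over a grid x grid partition,
    # so mismatches in mass location, size, and count are penalised cell
    # by cell rather than only pixel by pixel.
    # `grid` and `alpha` are illustrative assumptions (the paper picks
    # the grid size per sample according to mass size).
    h, w = pred.shape
    ch, cw = h // grid, w // grid
    cell_losses = []
    for i in range(grid):
        for j in range(grid):
            ps = pred[i * ch:(i + 1) * ch, j * cw:(j + 1) * cw]
            ts = target[i * ch:(i + 1) * ch, j * cw:(j + 1) * cw]
            cell_losses.append(bce(ps, ts))
    return alpha * bce(pred, target) + (1 - alpha) * np.mean(cell_losses)
```

With this formulation, a prediction of the right total area but in the wrong grid cells incurs extra per-cell penalty that a purely global pixel-level loss would partially miss.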
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6932