Unsupervised Domain Adaptation for Medical Image Segmentation via Self-Training of Early Features

Published: 28 Feb 2022 (MIDL 2022), Last Modified: 16 May 2023
Keywords: Unsupervised Domain Adaptation, Segmentation
Abstract: U-Net models provide a state-of-the-art approach for medical image segmentation, but their accuracy is often reduced when training and test images come from different domains, such as different scanners. Recent work suggests that, when limited supervision is available for domain adaptation, early U-Net layers benefit the most from refinement. This motivates our proposed approach for self-supervised refinement, which does not require any manual annotations, but instead refines early layers based on the richer, higher-level information that is derived in later layers of the U-Net. This is achieved by adding a segmentation head for early features and using the final predictions of the network as pseudo-labels for refinement. This strategy reduces the detrimental effects of imperfections in the pseudo-labels, which are unavoidable given the domain shift, by retaining their probabilistic nature and restricting the refinement to early layers. Experiments on two medical image segmentation tasks confirm the effectiveness of this approach, even in a one-shot setting, and show that it compares favorably to a baseline method for unsupervised domain adaptation.
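The core mechanism described in the abstract — an auxiliary segmentation head on early features, trained against the network's own final predictions kept as soft (probabilistic) pseudo-labels — could be sketched as follows. This is a hypothetical illustration under stated assumptions, not the authors' implementation (see their repository for that); names such as `EarlyHead` and `refinement_loss` are invented here, and the choice of a 1x1 convolution and a KL-divergence loss is one plausible reading of "retaining the probabilistic nature" of the pseudo-labels.

```python
# Hedged sketch of self-training early U-Net features (not the authors' code).
# Assumes a pre-trained U-Net from which early feature maps can be extracted;
# only the auxiliary head below would be trained on target-domain images.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EarlyHead(nn.Module):
    """Auxiliary segmentation head attached to early U-Net features."""

    def __init__(self, in_channels: int, n_classes: int):
        super().__init__()
        # A 1x1 convolution maps early feature channels to class logits.
        self.conv = nn.Conv2d(in_channels, n_classes, kernel_size=1)

    def forward(self, early_feats: torch.Tensor, out_size) -> torch.Tensor:
        logits = self.conv(early_feats)
        # Upsample to the resolution of the network's final prediction.
        return F.interpolate(logits, size=out_size, mode="bilinear",
                             align_corners=False)


def refinement_loss(early_logits: torch.Tensor,
                    final_logits: torch.Tensor) -> torch.Tensor:
    """KL divergence between the early head's prediction and soft pseudo-labels.

    The final prediction is detached and kept as a probability distribution
    (soft labels), rather than being hardened via argmax, so that pseudo-label
    uncertainty is preserved during refinement.
    """
    soft_targets = F.softmax(final_logits.detach(), dim=1)
    log_probs = F.log_softmax(early_logits, dim=1)
    return F.kl_div(log_probs, soft_targets, reduction="batchmean")
```

In a training loop, one would forward a target-domain image through the frozen U-Net to obtain both the early features and the final logits, then backpropagate `refinement_loss` only into the early layers and the auxiliary head.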
Registration: I acknowledge that publication of this at MIDL and in the proceedings requires at least one of the authors to register and present the work during the conference.
Authorship: I confirm that I am the author of this work and that it has not been submitted to another publication before.
Paper Type: methodological development
Primary Subject Area: Transfer Learning and Domain Adaptation
Secondary Subject Area: Segmentation
Confidentiality And Author Instructions: I read the call for papers and author instructions. I acknowledge that exceeding the page limit and/or altering the latex template can result in desk rejection.
Code And Data:
Code: https://github.com/ferasha/UDAS
Data: https://sites.google.com/view/calgary-campinas-dataset/download and https://www.ub.edu/mnms/