Robust medical image segmentation by adapting neural networks for each test image

Apr 06, 2021 (edited Apr 20, 2021) · MIDL 2021 Conference Short Submission
  • Keywords: medical image segmentation, cross-scanner robustness, domain generalization
  • TL;DR: A method to make CNNs more robust to scanner and protocol changes, by adapting them for each test image.
  • Abstract: The performance of convolutional neural networks (CNNs) used for medical image analysis degrades markedly when training and test images differ in their acquisition details, such as the scanner model or the protocol. We tackle this issue for the task of image segmentation by adapting a CNN ($C$) for each test image. Specifically, we design $C$ as a concatenation of a shallow normalization CNN ($N$), followed by a deep CNN ($S$) that segments the normalized image. At test time, we adapt $N$ for each test image, guided by an implicit prior on the predicted labels, which is modelled using an independently trained denoising autoencoder ($D$). The method is validated on multi-center MRI datasets of three anatomies. This article is a short version of the journal paper~\cite{karani2021test}.
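The test-time adaptation loop described in the abstract can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: `N`, `S`, and `D` below are hypothetical stand-in functions (a scale-and-shift "normalizer", a sigmoid "segmenter", and a smoothing "denoiser"), and the gradients are computed by finite differences for self-containedness, whereas the actual method uses CNNs and backpropagates through $D(S(N(x)))$ while keeping $S$ and $D$ frozen.

```python
import numpy as np

rng = np.random.default_rng(0)

def N(x, params):
    # Shallow per-image "normalization" module (toy: scale and shift).
    scale, shift = params
    return scale * x + shift

def S(z):
    # Frozen toy "segmenter": soft foreground probability per pixel.
    return 1.0 / (1.0 + np.exp(-10.0 * (z - 0.5)))

def D(y):
    # Frozen toy "denoising autoencoder": smooths the label map,
    # standing in for the implicit prior on plausible segmentations.
    kernel = np.ones(5) / 5.0
    return np.convolve(y, kernel, mode="same")

def prior_loss(params, x):
    # Discrepancy between the prediction and its denoised version.
    y = S(N(x, params))
    return float(np.mean((D(y) - y) ** 2))

def adapt(x, params, lr=0.5, steps=30, eps=1e-4):
    # Test-time adaptation: adjust only N's parameters to reduce the
    # prior loss (finite-difference gradients here; backprop in practice).
    params = np.asarray(params, dtype=float)
    for _ in range(steps):
        grad = np.array([
            (prior_loss(params + eps * e, x)
             - prior_loss(params - eps * e, x)) / (2 * eps)
            for e in np.eye(len(params))
        ])
        candidate = params - lr * grad
        if prior_loss(candidate, x) < prior_loss(params, x):
            params = candidate
        else:
            lr *= 0.5  # simple backtracking keeps the loss non-increasing
    return params

# A shifted "test image": intensities near the decision boundary make
# the initial segmentation speckled, which the denoising prior penalizes.
x = np.concatenate([rng.normal(0.35, 0.15, 100),
                    rng.normal(0.65, 0.15, 100)])
params0 = np.array([1.0, 0.0])  # identity normalization
adapted = adapt(x, params0)
```

After adaptation, the prior loss on the test image is no larger than with the identity normalization, mirroring how adapting $N$ pulls the predicted labels toward the learned prior over plausible segmentations.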
  • Paper Type: both
  • Primary Subject Area: Segmentation
  • Secondary Subject Area: Transfer Learning and Domain Adaptation
  • Paper Status: based on accepted/submitted journal paper
  • Source Code Url: https://github.com/neerakara/test-time-adaptable-neural-networks-for-domain-generalization
  • Data Set Url: https://www.humanconnectome.org/, http://fcon_1000.projects.nitrc.org/indi/abide/, https://wiki.cancerimagingarchive.net/display/Public/NCI-ISBI+2013+Challenge+-+Automated+Segmentation+of+Prostate+Structures, https://promise12.grand-challenge.org/
  • Registration: I acknowledge that publication of this at MIDL and in the proceedings requires at least one of the authors to register and present the work during the conference.
  • Authorship: I confirm that I am the author of this work and that it has not been submitted to another publication before.