Improving the Ability of Deep Neural Networks to Use Information From Multiple Views in Breast Cancer Screening

Jan 25, 2020 (edited Jun 24, 2020) · MIDL 2020 Conference Blind Submission
  • Keywords: Breast cancer screening, deep neural networks, multimodal learning, multiview learning
  • TL;DR: In breast cancer screening, we encourage deep neural networks to better utilize information from the two views of the breast by studying and improving their learning dynamics
  • Track: full conference paper
  • Paper Type: methodological development
  • Abstract: In breast cancer screening, radiologists make the diagnosis based on images of the breast taken from two angles. Inspired by this, we seek to improve the performance of deep neural networks applied to this task by encouraging the model to use information from both views of the breast. First, we took a closer look at the training process and observed an imbalance between learning from the two views. In particular, we observed that the layers processing one of the views have parameters with gradients of larger magnitude, and contribute more to the overall loss reduction. Next, we tested several methods targeted at utilizing both views more equally in training. We found that using the same weights to process both views, or using modality dropout, leads to a boost in performance. Looking forward, our results indicate that improving learning dynamics is a promising avenue for better utilization of multiple views in deep neural networks for medical diagnosis.
  • Presentation Upload Agreement: I agree that my presentation material (videos and slides) will be made public.
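The two interventions named in the abstract — sharing weights between the view-specific columns, and modality dropout (randomly zeroing one view's features during training so the classifier cannot rely on a single view) — can be illustrated with a minimal sketch. This is not the authors' architecture: the function `two_view_features`, the toy `linear` layer, and all parameter names are hypothetical stand-ins for the per-view columns of a real two-view breast cancer screening model.

```python
import random

def linear(x, W, b):
    # Toy fully connected layer: y_j = sum_i W[j][i] * x[i] + b[j].
    # W is a list of output rows; stands in for a real view-specific column.
    return [sum(wi * xi for wi, xi in zip(row, x)) + bj
            for row, bj in zip(W, b)]

def two_view_features(cc, mlo, W_cc, b_cc, W_mlo, b_mlo,
                      share_weights=False, modality_dropout_p=0.0,
                      training=True):
    # Weight sharing: process both views (CC and MLO) with the same
    # parameters, so neither view's column can dominate training.
    if share_weights:
        W_mlo, b_mlo = W_cc, b_cc
    f_cc = linear(cc, W_cc, b_cc)
    f_mlo = linear(mlo, W_mlo, b_mlo)
    # Modality dropout: during training, with probability
    # modality_dropout_p, zero out one randomly chosen view's features.
    if training and random.random() < modality_dropout_p:
        if random.random() < 0.5:
            f_cc = [0.0] * len(f_cc)
        else:
            f_mlo = [0.0] * len(f_mlo)
    # Concatenated features would be fed to a shared classifier head.
    return f_cc + f_mlo
```

For example, with `share_weights=True` the two halves of the output are computed by identical parameters, and with `modality_dropout_p=1.0` exactly one view's features are zeroed on every training step.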