MoCo Pretraining Improves Representation and Transferability of Chest X-ray Models

Published: 31 Mar 2021, Last Modified: 16 May 2023
Venue: MIDL 2021
Readers: Everyone
Keywords: Radiology, Chest X-Ray, Contrastive Learning, Transfer Learning
TL;DR: MoCo-pretraining provides high-quality representations and transferable initializations for chest X-ray interpretation.
Abstract: Contrastive learning is a form of self-supervision that can leverage unlabeled data to produce pretrained models. While contrastive learning has demonstrated promising results on natural image classification tasks, its application to medical imaging tasks like chest X-ray interpretation has been limited. In this work, we propose MoCo-CXR, an adaptation of the contrastive learning method Momentum Contrast (MoCo), to produce models with better representations and initializations for the detection of pathologies in chest X-rays. In detecting pleural effusion, we find that linear models trained on MoCo-CXR-pretrained representations outperform those trained on representations without MoCo-CXR-pretraining, indicating that MoCo-CXR-pretrained representations are of higher quality. End-to-end fine-tuning experiments reveal that a model initialized via MoCo-CXR-pretraining outperforms its non-MoCo-CXR-pretrained counterpart. We find that MoCo-CXR-pretraining provides the most benefit with limited labeled training data. Finally, we demonstrate similar results on a target tuberculosis dataset unseen during pretraining, indicating that MoCo-CXR-pretraining endows models with representations and transferability that can be applied across chest X-ray datasets and tasks.
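Note: For readers unfamiliar with Momentum Contrast, the minimal PyTorch sketch below illustrates the contrastive step that MoCo-CXR adapts to unlabeled chest X-rays: a query encoder trained with the InfoNCE loss against a momentum-updated key encoder and a queue of negative keys. This is an illustrative sketch, not the authors' implementation (see the source code URL below); the ResNet-18 backbone, 128-dimensional projection, queue size, momentum, and temperature are assumptions made for the example.

    import torch
    import torch.nn.functional as F
    import torchvision

    def build_encoder(feature_dim=128):
        # Backbone with a projection head replacing the classification layer.
        model = torchvision.models.resnet18(weights=None)
        model.fc = torch.nn.Linear(model.fc.in_features, feature_dim)
        return model

    # Query encoder (updated by backprop) and key encoder (momentum-updated copy).
    encoder_q = build_encoder()
    encoder_k = build_encoder()
    encoder_k.load_state_dict(encoder_q.state_dict())
    for p in encoder_k.parameters():
        p.requires_grad = False  # the key encoder is never updated by the optimizer

    # Queue of negative keys (assumed size 65536) and MoCo hyperparameters.
    queue = F.normalize(torch.randn(65536, 128), dim=1)
    momentum, temperature = 0.999, 0.07

    def moco_step(x_q, x_k):
        """One contrastive step on two augmented views of the same X-rays."""
        q = F.normalize(encoder_q(x_q), dim=1)              # queries: (N, 128)
        with torch.no_grad():
            # Momentum (exponential moving average) update of the key encoder.
            for p_k, p_q in zip(encoder_k.parameters(), encoder_q.parameters()):
                p_k.data.mul_(momentum).add_(p_q.data, alpha=1 - momentum)
            k = F.normalize(encoder_k(x_k), dim=1)          # positive keys: (N, 128)

        l_pos = (q * k).sum(dim=1, keepdim=True)            # similarity to positives: (N, 1)
        l_neg = q @ queue.T                                  # similarity to queued negatives: (N, K)
        logits = torch.cat([l_pos, l_neg], dim=1) / temperature
        labels = torch.zeros(logits.size(0), dtype=torch.long)  # positive key sits at index 0
        # In full MoCo, k would then be enqueued and the oldest keys dequeued.
        return F.cross_entropy(logits, labels)               # InfoNCE loss

In the paper's evaluations, an encoder pretrained in this fashion is then either probed with a linear classifier on frozen representations or fine-tuned end-to-end on labeled chest X-rays.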
Registration: I acknowledge that publication of this at MIDL and in the proceedings requires at least one of the authors to register and present the work during the conference.
Authorship: I confirm that I am the author of this work and that it has not been submitted to another publication before.
Paper Type: validation/application paper
Primary Subject Area: Application: Radiology
Secondary Subject Area: Detection and Diagnosis
Source Code Url: https://github.com/stanfordmlgroup/MoCo-CXR
Source Latex: zip