A Comprehensive Benchmark of Supervised and Self-supervised Pre-training on Multi-view Chest X-ray Classification

Published: 06 Jun 2024, Last Modified: 06 Jun 2024 · MIDL 2024 Poster · CC BY 4.0
Keywords: Multi-view X-ray, self-supervised, pre-training
Abstract: Chest X-ray analysis in medical imaging has largely focused on single-view methods. Recent work, however, has produced multi-view approaches that exploit the complementary views acquired for the same patient. Although these methods improve performance, collecting large labeled multi-view datasets is especially difficult owing to prohibitive annotation costs and acquisition times. It is therefore crucial to address the multi-view setting in the low-data regime. Pre-training is a critical component for performing well in this regime, as evidenced by its gains in both natural and medical imaging. In the multi-view setup, however, such pre-training strategies have received relatively little attention, and ImageNet initialization remains the norm. We bridge this gap with an extensive benchmarking study of 10 strong supervised and self-supervised models, pre-trained on natural and medical images, for multi-view chest X-ray classification. We further examine performance in the low-data regime by training these methods on 1%, 10%, and 100% of the training set. Our best models yield significant improvements over existing state-of-the-art multi-view approaches, outperforming them by as much as 9.9%, 8.8%, and 1.6% on the 1%, 10%, and 100% data fractions, respectively. We hope this benchmark will spur the development of stronger multi-view medical imaging models, mirroring the role such benchmarks have played in other computer vision and medical imaging domains. In the spirit of open science, we make our code publicly available to aid the development of stronger multi-view models.
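To make the benchmarked setup concrete, below is a minimal sketch of the kind of pipeline the abstract describes: a pre-trained backbone shared across two views with late feature fusion, plus subsampling of the training set to a fixed fraction (1%, 10%, 100%). This is an illustrative assumption, not the authors' implementation; the class names, the concatenation-based fusion, and the ImageNet ResNet-50 checkpoint (standing in for any of the 10 benchmarked initializations) are all hypothetical.

```python
# Hypothetical sketch of a multi-view chest X-ray classifier in the
# low-data regime. Architecture and fusion strategy are assumptions,
# not the paper's method.
import torch
import torch.nn as nn
from torch.utils.data import Dataset, Subset
from torchvision.models import resnet50, ResNet50_Weights


class MultiViewClassifier(nn.Module):
    """Shared encoder over frontal/lateral views with late concat fusion."""

    def __init__(self, num_classes: int):
        super().__init__()
        # Any supervised or self-supervised checkpoint could be loaded here;
        # ImageNet weights stand in for the benchmarked initializations.
        backbone = resnet50(weights=ResNet50_Weights.IMAGENET1K_V2)
        # Drop the final fc layer, keep everything through global avg-pool.
        self.encoder = nn.Sequential(*list(backbone.children())[:-1])
        self.head = nn.Linear(2 * 2048, num_classes)

    def forward(self, frontal: torch.Tensor, lateral: torch.Tensor):
        f = self.encoder(frontal).flatten(1)  # (B, 2048)
        l = self.encoder(lateral).flatten(1)  # (B, 2048)
        return self.head(torch.cat([f, l], dim=1))


def subsample(dataset: Dataset, fraction: float, seed: int = 0) -> Subset:
    """Keep a fixed random fraction of the training set (e.g. 0.01, 0.1, 1.0)."""
    g = torch.Generator().manual_seed(seed)
    n = max(1, int(len(dataset) * fraction))
    idx = torch.randperm(len(dataset), generator=g)[:n]
    return Subset(dataset, idx.tolist())
```

Under this reading, each label fraction defines a separate fine-tuning run of the same model, which is one straightforward way to realize the 1%/10%/100% comparison the abstract reports.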
Latex Code: zip
Copyright Form: pdf
Submission Number: 150