Keywords: Scarce data, Ejection fraction, Echocardiography, Parasternal Long-Axis (PLAX), Video View Classification, Proxy Labeling, PhysioNet MIMIC Dataset
TL;DR: We leveraged scarce public echocardiography data to build a large PLAX dataset with EF labels by transferring EF predictions from A4C models, achieving an MAE of 7.15% for EF prediction from PLAX views on a ground-truth test dataset we established.
Abstract: We developed a machine learning model to predict left ventricular ejection fraction (LVEF/EF) from parasternal long-axis (PLAX) echocardiographic videos. Because public datasets with labeled PLAX videos are virtually non-existent, our work focuses on an innovative data generation strategy to overcome this scarcity. By leveraging a time-based correlation between clinical notes and echocardiographic videos, combined with fine-tuned view classifiers and proxy labeling, we effectively created a large labeled PLAX dataset and achieved a mean absolute error (MAE) of 7.15%. Given that apical four-chamber (A4C) methods, the clinical standard, report MAE values of 6%-7%, our results demonstrate that EF estimation from PLAX views is both feasible and clinically relevant. Our approach surpasses the performance of existing methods and provides a clinically useful solution for situations where apical views may not be feasible. The EF labels for PLAX videos, derived from publicly available datasets, are accessible at https://github.com/Jeffrey4899/PLAX_EF_Labels_202501.
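As a rough illustration of the proxy-labeling idea summarized above, the sketch below (all function and variable names are hypothetical, not the authors' released code) transfers EF predictions from a pretrained A4C model to PLAX clips acquired in the same study within a time window, and computes the MAE metric used for evaluation.

```python
import numpy as np

def proxy_label_plax(studies, a4c_ef_model, view_classifier, max_time_gap_s=600):
    """Assign proxy EF labels to PLAX clips using A4C predictions from the same study.

    `studies` maps a study ID to a list of (acquisition_time_s, video) tuples;
    `a4c_ef_model` and `view_classifier` are assumed to be pretrained callables.
    """
    labeled_plax = []
    for study_id, clips in studies.items():
        a4c = [(t, v) for t, v in clips if view_classifier(v) == "A4C"]
        plax = [(t, v) for t, v in clips if view_classifier(v) == "PLAX"]
        if not a4c or not plax:
            continue
        # Study-level EF: average of the A4C model's predictions (the proxy label source).
        study_ef = float(np.mean([a4c_ef_model(v) for _, v in a4c]))
        for t_plax, v_plax in plax:
            # Only transfer the label when an A4C clip was acquired close in time.
            if min(abs(t_plax - t_a4c) for t_a4c, _ in a4c) <= max_time_gap_s:
                labeled_plax.append((study_id, v_plax, study_ef))
    return labeled_plax

def mean_absolute_error(predicted_ef, true_ef):
    """MAE in EF percentage points, the metric reported in the paper (7.15%)."""
    return float(np.mean(np.abs(np.asarray(predicted_ef) - np.asarray(true_ef))))
```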
Primary Subject Area: Learning with Noisy Labels and Limited Data
Secondary Subject Area: Application: Cardiology
Paper Type: Both
Registration Requirement: Yes
Submission Number: 205