From Leads to Latents: Attention-Driven Masked Autoencoder for ECG Time Series

Published: 02 Mar 2026, Last Modified: 02 Mar 2026 · ICLR 2026 Workshop GRaM Poster · CC BY 4.0
Track: tiny paper (up to 4 pages)
Keywords: Self-supervised learning, masked autoencoders, attention mechanisms, representation learning, multi-lead ECG modeling
TL;DR: LAMAE pretrains on 12-lead ECGs with latent attention to exploit cross-lead redundancy. This structure-aware MAE learns better representations, improving ICD-10 prediction, especially with limited labels and for ECG-salient diagnoses.
Abstract: Electrocardiograms (ECGs) are among the most widely available clinical signals and play a central role in cardiovascular diagnosis. While recent foundation models have shown promise for learning transferable ECG representations, most existing pretraining approaches treat leads as independent channels and fail to explicitly leverage their strong structural redundancy. We introduce the latent attention masked autoencoder (LAMAE), a framework that directly exploits this structure by learning cross-lead connection mechanisms during self-supervised pretraining. Our approach models higher-order interactions across leads through latent attention, enabling permutation-invariant aggregation and adaptive weighting of lead-specific representations. We provide empirical evidence on the MIMIC-IV-ECG database that leveraging cross-lead connections constitutes an effective form of structural supervision, improving representation quality and transferability. Our method shows strong performance on the difficult task of predicting ICD-10 codes, outperforming independent-lead masked modeling and alignment-based baselines.
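As a reading aid, the following is a minimal PyTorch sketch of what the latent-attention aggregation described in the abstract might look like: learnable latent queries attend over per-lead embeddings, giving a content-based, permutation-invariant pooling of the 12 leads. The class name `LatentLeadAttention`, all dimensions, and the use of `nn.MultiheadAttention` are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of cross-lead latent attention pooling (assumed design,
# not the authors' code). Latent queries attend over lead embeddings.
import torch
import torch.nn as nn

class LatentLeadAttention(nn.Module):
    """Aggregate per-lead ECG representations with learnable latent queries.

    Attention weights depend only on content, so without lead-position
    encodings the pooling is invariant to the order of the input leads.
    """

    def __init__(self, dim: int = 256, num_latents: int = 4, num_heads: int = 4):
        super().__init__()
        # Learnable latent tokens that query the set of lead embeddings.
        self.latents = nn.Parameter(torch.randn(num_latents, dim))
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, lead_embeddings: torch.Tensor) -> torch.Tensor:
        # lead_embeddings: (batch, num_leads, dim), one vector per ECG lead.
        batch = lead_embeddings.shape[0]
        queries = self.latents.unsqueeze(0).expand(batch, -1, -1)
        # Latents attend over leads: adaptive, content-based lead weighting.
        pooled, _ = self.attn(queries, lead_embeddings, lead_embeddings)
        return self.norm(pooled)  # (batch, num_latents, dim)

# Example: pool 12-lead embeddings into 4 latent summary tokens.
x = torch.randn(8, 12, 256)          # batch of 8 ECGs, 12 leads, 256-d each
pooled = LatentLeadAttention()(x)    # -> (8, 4, 256)
```

Because no lead-identity embedding is added, reordering the 12 input leads leaves the pooled output unchanged, matching the permutation-invariant aggregation the abstract describes; the adaptive weighting comes from the attention scores between latents and leads.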
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Presenter: ~Samuel_Ruiperez-Campillo1
Format: Yes, the presenting author will attend in person if this work is accepted to the workshop.
Funding: No, the presenting author of this submission does *not* fall under ICLR’s funding aims, or has sufficient alternate funding.
Serve As Reviewer: ~Samuel_Ruiperez-Campillo1
Submission Number: 128