Towards Self-Supervised Foundation Models for Critical Care Time Series

Published: 23 Sept 2025, Last Modified: 01 Dec 2025 · TS4H NeurIPS 2025 Poster · CC BY 4.0
Keywords: Electronic Health Records, Deep Learning, Foundation Models, Time Series Analysis, Self-supervised Learning
TL;DR: We built early-stage foundation models for ICU time series data that outperform baseline models for mortality prediction when labeled training data is scarce
Abstract: Domain-specific foundation models for healthcare have expanded rapidly in recent years, yet foundation models for critical care time series remain relatively underexplored due to the limited size and availability of datasets. In this work, we introduce early-stage pre-trained foundation models for critical care time series based on the Bi-Axial Transformer (BAT), trained on pooled electronic health record datasets. We demonstrate effective transfer learning by fine-tuning the models for mortality prediction on a dataset distinct from the training sources, where they outperform supervised baselines, particularly for smaller datasets. These contributions highlight the potential of self-supervised foundation models for critical care time series to support generalizable and robust clinical applications in resource-limited settings. Code Availability: https://github.com/Katja-Jagd/YAIB
Submission Number: 122