JETS: A Self-Supervised Joint Embedding Time Series Foundation Model for Behavioral Data in Healthcare

Published: 23 Sept 2025, Last Modified: 04 Nov 2025, TS4H NeurIPS 2025, CC BY 4.0
Keywords: self-supervised learning, mamba, time series, representation learning, digital health
Abstract: Behavioral time series from wearable devices offer rich health insights but are often characterized by noise, missing values, and irregular sampling. While prior research on learning from physiological time series has focused on denser, regularly sampled, low-level sensor data, self-supervised pre-training on high-level behavioral data remains a key challenge. We propose JETS (Joint Embedding for Time Series), a masked pre-training framework designed to address these challenges in behavioral time series. JETS was pre-trained on a large-scale, long-term dataset collected from real-world wearables, demonstrating robustness to the high degree of noise and incompleteness by absorbing them in a learned latent space. When fine-tuned and evaluated on downstream, individual-level diagnostic prediction tasks, JETS outperforms established baselines, validating the effectiveness of joint-embedding architectures for ubiquitous behavioral data and paving the way for new applications in personalized digital health.
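The abstract describes a masked joint-embedding objective: rather than reconstructing raw (noisy, incomplete) signal values, the model predicts latent representations of masked spans produced by a target encoder. The sketch below is a minimal NumPy illustration of that general idea under assumptions not stated in the abstract (patch-based masking, a linear encoder, and an EMA-updated target branch, as in JEPA-style methods); all names, shapes, and hyperparameters are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical input: one week of hourly step counts from a wearable,
# with missing values typical of real-world behavioral data.
series = rng.poisson(lam=300, size=168).astype(float)  # 7 days * 24 hours
series[rng.random(168) < 0.2] = np.nan                 # simulate missingness

patch_len, d_model = 8, 16
patches = np.nan_to_num(series).reshape(-1, patch_len)  # (21, 8), naive imputation

W_online = rng.normal(scale=0.1, size=(patch_len, d_model))  # online encoder
W_target = W_online.copy()                                   # EMA target encoder
W_pred = rng.normal(scale=0.1, size=(d_model, d_model))      # latent predictor

mask = rng.random(len(patches)) < 0.5  # mask half of the patches

# Target branch sees the full series; online branch only sees visible patches
# and must predict the target embeddings at masked positions.
z_target = patches @ W_target
visible = np.where(mask[:, None], 0.0, patches)  # hide masked patches
z_pred = (visible @ W_online) @ W_pred

# Loss is computed only at masked positions, entirely in latent space,
# so raw-signal noise is never a reconstruction target.
loss = float(np.mean((z_pred[mask] - z_target[mask]) ** 2))

# Stop-gradient target branch: updated as an exponential moving average.
tau = 0.99
W_target = tau * W_target + (1 - tau) * W_online
```

The key design choice this illustrates is that the prediction target lives in representation space, which is one plausible reading of how such a framework could stay robust to noise and incompleteness in the raw behavioral signal.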
Submission Number: 12