TL;DR: NormWear: A Normative Foundation Model that Learns Representations of Multimodal Wearable Sensing Signals for Diverse Digital Healthcare Applications
Abstract: Time-series foundation models excel at tasks like forecasting across diverse data types by leveraging informative waveform representations. Wearable sensing data, however, pose unique challenges due to their variability in patterns and frequency bands, especially for healthcare-related outcomes. The main obstacle lies in crafting generalizable representations that adapt efficiently across heterogeneous sensing configurations and applications. To address this, we propose NormWear, a foundation model designed to extract generalized and informative representations from wearable sensing data. NormWear is pretrained on a diverse set of physiological signals from various public datasets, including photoplethysmography (PPG), electrocardiography (ECG), electroencephalography (EEG), galvanic skin response (GSR), and inertial measurement unit (IMU) signals. For evaluation, we benchmark its performance on 11 public wearable sensing datasets spanning 18 applications in mental health, body state inference, biomarker estimation, and disease risk evaluation, where it outperforms competitive baselines. Additionally, using a novel representation-alignment-match method, we align physiological signal embeddings with text embeddings, enabling zero-shot inference for unseen wearable signal-based health applications.
Primary Area: Deep Learning->Foundation Models
Keywords: Foundation Model, Signal Processing, Time Series, Wearable Sensing, Digital Healthcare
Application-Driven Machine Learning: This submission is on Application-Driven Machine Learning.
Submission Number: 9339