Conditional Diffusion Models as Self-supervised Learning Backbone for Irregular Time Series

Authors: ICLR 2024 Workshop TS4H Submission 12

Published: 08 Mar 2024, Last Modified: 01 Apr 2024
Venue: TS4H Poster
License: CC BY 4.0
Keywords: Time Series, Diffusion Models, Irregular Time Series Classification, Health Data
TL;DR: We pretrain conditional diffusion models and use them as a backbone for downstream tasks.
Abstract: Irregular time series are ubiquitous in healthcare, with applications ranging from predicting patient health conditions to imputing missing values. Recent conditional diffusion models, which predict missing values from observed data, have shown significant promise for imputing regular time series. This approach also generalizes the self-supervised task of masked reconstruction, replacing partial masking with the injection of noise at variable scales, and has achieved competitive results in image recognition. Despite growing interest in diffusion models, their potential for irregular time series, particularly in downstream tasks, remains underexplored. We propose a conditional diffusion model designed as a self-supervised learning backbone for such data, integrating a learnable time embedding and a cross-dimensional attention mechanism to capture the data's complex temporal dynamics. The model not only naturally suits conditional generation tasks but also learns hidden states that benefit discriminative tasks. Empirical results demonstrate our model's superiority on both imputation and classification tasks.
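The abstract describes the architecture only at a high level. Purely as an illustration of the stated ingredients (conditioning on observed values, a learnable embedding of continuous timestamps, and cross-dimensional attention), here is a minimal PyTorch sketch; every module name, dimension, and the toy noise schedule below is a hypothetical assumption, not the authors' implementation.

```python
import torch
import torch.nn as nn


class LearnableTimeEmbedding(nn.Module):
    """Embed continuous (irregular) timestamps. Assumption: learnable
    sinusoidal frequencies plus a linear projection; the paper's exact
    embedding may differ."""
    def __init__(self, dim: int):
        super().__init__()
        self.freqs = nn.Parameter(torch.randn(dim // 2))  # learnable frequencies
        self.proj = nn.Linear(dim, dim)

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (B, L) continuous observation times
        ang = t.unsqueeze(-1) * self.freqs                 # (B, L, dim/2)
        emb = torch.cat([ang.sin(), ang.cos()], dim=-1)    # (B, L, dim)
        return self.proj(emb)


class ConditionalDenoiser(nn.Module):
    """Hypothetical noise predictor: attends over time steps and, separately,
    across feature dimensions ("cross-dimensional attention"), conditioned on
    observed values via a mask channel."""
    def __init__(self, dim: int = 64, heads: int = 4, num_steps: int = 1000):
        super().__init__()
        self.in_proj = nn.Linear(2, dim)              # (noisy value, observed mask)
        self.time_emb = LearnableTimeEmbedding(dim)
        self.step_emb = nn.Embedding(num_steps, dim)  # diffusion-step embedding
        self.temporal_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.feature_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.out_proj = nn.Linear(dim, 1)

    def forward(self, x_noisy, mask, timestamps, k):
        # x_noisy, mask: (B, L, F); timestamps: (B, L); k: (B,) diffusion step
        B, L, F = x_noisy.shape
        h = self.in_proj(torch.stack([x_noisy, mask], dim=-1))  # (B, L, F, dim)
        h = h + self.time_emb(timestamps).unsqueeze(2)          # broadcast over F
        h = h + self.step_emb(k)[:, None, None, :]
        # Attention over the time axis, independently per feature dimension.
        ht = h.permute(0, 2, 1, 3).reshape(B * F, L, -1)
        ht, _ = self.temporal_attn(ht, ht, ht)
        h = ht.reshape(B, F, L, -1).permute(0, 2, 1, 3)
        # Cross-dimensional attention: over features, per time step.
        hf = h.reshape(B * L, F, -1)
        hf, _ = self.feature_attn(hf, hf, hf)
        h = hf.reshape(B, L, F, -1)
        return self.out_proj(h).squeeze(-1)                     # predicted noise


# One self-supervised pretraining step (sketch): condition on observed entries,
# diffuse the held-out ones, and regress the injected noise.
model = ConditionalDenoiser()
x = torch.randn(4, 50, 8)                          # B=4 series, L=50, F=8 features
mask = (torch.rand(4, 50, 8) > 0.3).float()        # 1 = observed (conditioning)
ts = torch.sort(torch.rand(4, 50), dim=1).values   # irregular timestamps
k = torch.randint(0, 1000, (4,))
noise = torch.randn_like(x)
a_bar = torch.linspace(1.0, 0.01, 1000)[k].view(-1, 1, 1)  # toy schedule
x_noisy = mask * x + (1 - mask) * (a_bar.sqrt() * x + (1 - a_bar).sqrt() * noise)
pred = model(x_noisy, mask, ts, k)
loss = (((pred - noise) * (1 - mask)) ** 2).sum() / (1 - mask).sum()
loss.backward()
```

After pretraining, the denoiser's hidden states (here, h before out_proj) could be pooled as features for downstream classification, matching the backbone usage described in the TL;DR.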
Submission Number: 12