TimeAutoDiff: Generation of Heterogeneous Time Series Data via Latent Diffusion Model

ICLR 2025 Conference Submission 2503 Authors

22 Sept 2024 (modified: 24 Nov 2024) · ICLR 2025 Conference Submission · CC BY 4.0
Keywords: Time series data, Tabular data, Heterogeneous, Diffusion model, VAE, Generative model
TL;DR: We develop a time series tabular synthesizer, combining VAE and diffusion model.
Abstract: In this paper, we leverage the power of latent diffusion models to generate synthetic time series tabular data. Along with temporal and feature correlations, the heterogeneous nature of the features in a table has been one of the main obstacles in time series tabular data modeling. We tackle this problem by combining the ideas of the variational auto-encoder (VAE) and the denoising diffusion probabilistic model (DDPM). Our model, named \texttt{TimeAutoDiff}, has several key advantages: (1) \textit{\textbf{Generality}}: the ability to handle a broad spectrum of time series tabular data with heterogeneous, continuous-only, or categorical-only features; (2) \textit{\textbf{Fast sampling speed}}: entire time series are generated at once, as opposed to the sequential sampling schemes implemented in existing diffusion-based models, leading to significant improvements in sampling speed; (3) \textit{\textbf{Time-varying metadata conditional generation}}: heterogeneous outputs can be generated conditioned on heterogeneous, time-varying features, enabling scenario exploration across multiple scientific and engineering domains; (4) \textit{\textbf{Good fidelity and utility guarantees}}: numerical experiments on eight publicly available datasets demonstrate significant improvements over state-of-the-art models in generating time series tabular data, across four metrics measuring fidelity and utility. Code for the model implementation is available in the supplementary materials.
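The abstract's core idea, encoding a heterogeneous table into a continuous latent sequence with a VAE and running a DDPM on those latents so that the whole sequence is sampled at once, can be sketched as follows. This is a minimal illustrative sketch, not the paper's architecture: the module names (`ToyVAE`, `forward_diffuse`), dimensions, and the linear noise schedule are all assumptions for demonstration.

```python
# Hypothetical sketch of the latent-diffusion pipeline the abstract describes:
# a VAE maps a time-series table into a continuous latent sequence, a DDPM is
# trained on those latents, and the decoder maps sampled latents back to rows.
# All names, sizes, and the noise schedule below are illustrative assumptions.
import torch
import torch.nn as nn

class ToyVAE(nn.Module):
    def __init__(self, feat_dim: int, latent_dim: int):
        super().__init__()
        self.enc = nn.Linear(feat_dim, 2 * latent_dim)   # -> (mu, logvar)
        self.dec = nn.Linear(latent_dim, feat_dim)

    def encode(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        return mu + torch.randn_like(mu) * (0.5 * logvar).exp()

    def decode(self, z):
        return self.dec(z)

def forward_diffuse(z0, t, alpha_bar):
    """DDPM forward process: q(z_t | z_0) = N(sqrt(a_bar) z_0, (1 - a_bar) I)."""
    noise = torch.randn_like(z0)
    a = alpha_bar[t].view(-1, 1, 1)
    return a.sqrt() * z0 + (1.0 - a).sqrt() * noise, noise

# Toy data: batch of 4 sequences, 16 time steps, 7 features (already numeric;
# in practice categorical columns would be embedded/encoded first).
B, T, F, D = 4, 16, 7, 8
x = torch.randn(B, T, F)

vae = ToyVAE(F, D)
z0 = vae.encode(x)                        # (B, T, D) latent sequence

# Linear beta schedule -> cumulative alpha_bar, as in standard DDPM.
betas = torch.linspace(1e-4, 0.02, 100)
alpha_bar = torch.cumprod(1.0 - betas, dim=0)

t = torch.randint(0, 100, (B,))
zt, eps = forward_diffuse(z0, t, alpha_bar)

# A denoiser would be trained to predict eps from (zt, t); at sampling time the
# reverse chain produces the WHOLE latent sequence (B, T, D) in one pass,
# which the decoder then maps back to a synthetic table.
x_hat = vae.decode(zt)
print(tuple(x_hat.shape))   # (4, 16, 7)
```

Sampling the entire latent sequence in one reverse chain, rather than row by row, is what the abstract credits for the speed advantage over sequential diffusion samplers.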
Supplementary Material: zip
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2503