Dynamic-Aware GANs: Time-Series Generation with Handy Self-Supervision

22 Sept 2022 (modified: 13 Feb 2023) · ICLR 2023 Conference Withdrawn Submission · Readers: Everyone
Keywords: Time-series modelling, Self-supervision, Deep Generative Models
TL;DR: This paper presents Dynamic-Aware GANs as a data-efficient self-supervised paradigm for time-series data generation.
Abstract: This paper presents Dynamic-Aware GAN (DAGAN) as a data-efficient self-supervised paradigm for time-series data generation. To support sequential generation with sufficient cues about temporal dynamics, we explicitly model the transition dynamics within the data sequence through differencing, refining the vanilla sequence into one in which each time-step is characterized by an inter-correlated triplet. This localized, triplet-consistent structure yields a self-supervision mechanism that provides additional supervision over the stepwise dependencies encoded in the training data. The mechanism is simple yet particularly beneficial when a model is presented with limited training data. Based on this insight, we present DAGAN, which generalizes the locally regularized triplet consistency to the distributional level via dynamic encoding and joint distribution matching. Experiments on various synthetic and real-world datasets verify that our model achieves generation results of better quality and diversity than state-of-the-art benchmarks, especially when training data is scarce. Moreover, benefiting from its dynamic-conditional and dynamic-consistent design, DAGAN is capable of generating sequences that exhibit specified dynamics.
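The submission itself includes no code, but the differencing-based triplet construction described in the abstract can be sketched as follows. This is a minimal illustration under our own reading: the function names and the exact triplet layout (previous value, first difference, current value) are assumptions for exposition, not the authors' implementation.

```python
# Hypothetical sketch of the differencing-based triplet refinement:
# each time-step t is characterized by (x_{t-1}, d_t, x_t) with
# d_t = x_t - x_{t-1}, so the identity x_{t-1} + d_t = x_t gives a
# free triplet-consistency signal for self-supervision.
import numpy as np

def to_triplets(x: np.ndarray) -> np.ndarray:
    """x: (T, D) sequence -> (T-1, 3, D) array of (prev, diff, curr) triplets."""
    prev, curr = x[:-1], x[1:]
    diff = curr - prev  # first-order differencing captures transition dynamics
    return np.stack([prev, diff, curr], axis=1)

def triplet_consistency_loss(triplets: np.ndarray) -> float:
    """Penalty on violating prev + diff = curr; zero by construction on real
    data, nonzero (and hence a regularizer) on generated triplets."""
    prev, diff, curr = triplets[:, 0], triplets[:, 1], triplets[:, 2]
    return float(np.mean((prev + diff - curr) ** 2))

# Toy usage: a random-walk sequence of length 100 with 5 channels.
x = np.cumsum(np.random.randn(100, 5), axis=0)
trip = to_triplets(x)
print(trip.shape, triplet_consistency_loss(trip))  # (99, 3, 5) ~0.0
```

On real data the loss is zero by construction; the point, as the abstract describes it, is that a generator emitting the three triplet components jointly can be regularized toward this consistency, supplying supervision beyond the adversarial signal alone.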
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Unsupervised and Self-supervised learning
Supplementary Material: zip