CLARE-GAN: GENERATION OF CLASS-SPECIFIC TIME SERIES

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission · Readers: Everyone
Keywords: Generative Adversarial Networks, Time Series
Abstract: Recently, numerous works (Esteban et al., 2017; Mogren, 2016; Yoon et al., 2019) have attempted to build generative models for time series that correctly reproduce the underlying temporal characteristics of a given training dataset. In this work, however, we show that the performance of these models is limited on datasets with high variability, for example those containing multiple classes. In such settings, it is extremely difficult for a generative model to find the right trade-off between sample fidelity, i.e. similarity to the real time series, and sample diversity. Furthermore, it is essential to preserve the original classes and the variation within each class. To tackle this issue, we propose a new class-sensitive generative model, Class-specific Recurrent GAN (CLaRe-GAN), that conditions the generator on auxiliary information comprising class-specific and class-independent attributes. Our model relies on class-specific encoders: a single encoder serving two contradictory functionalities, i.e. extracting both inter- and intra-class attributes. To extract a high-level representation of the time series, we make a shared-latent-space assumption (Liu et al., 2017). At the same time, we use a class discriminator that discriminates between the latent vectors to efficiently extract the class-specific attributes. We test our approach on a set of publicly available datasets in which the number of classes, the length of the time series and the number of time series available per class vary, and we evaluate it both visually and quantitatively. We show that our model outperforms state-of-the-art generative models and yields a significant and consistent improvement in the quality of the generated time series while preserving the classes and the variation of the original dataset.
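The abstract describes three interacting components: class-specific encoders that map time series into a shared latent space, a class discriminator acting on the latent vectors, and a recurrent generator conditioned on class-specific and class-independent attributes. Below is a minimal, hypothetical sketch of how such components might fit together, assuming PyTorch and GRU-based recurrent modules; every module name, layer size and design choice here is our own illustrative assumption based only on this abstract, not the authors' implementation.

```python
# Hypothetical sketch of the components named in the abstract. All sizes,
# the choice of GRUs, and the module names are assumptions for illustration.
import torch
import torch.nn as nn


class Encoder(nn.Module):
    """One encoder per class; all encoders map into the same (shared) latent space."""

    def __init__(self, input_dim, hidden_dim, latent_dim):
        super().__init__()
        self.rnn = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.to_latent = nn.Linear(hidden_dim, latent_dim)

    def forward(self, x):                      # x: (batch, seq_len, input_dim)
        _, h = self.rnn(x)                     # h: (1, batch, hidden_dim)
        return self.to_latent(h.squeeze(0))    # (batch, latent_dim)


class LatentClassDiscriminator(nn.Module):
    """Predicts the class of a latent vector, pushing the encoders to keep
    class-specific attributes separable within the shared latent space."""

    def __init__(self, latent_dim, n_classes):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, n_classes)
        )

    def forward(self, z):
        return self.net(z)                     # class logits


class Generator(nn.Module):
    """Recurrent generator conditioned on a latent code that carries the
    class-specific and class-independent attributes."""

    def __init__(self, noise_dim, latent_dim, hidden_dim, output_dim):
        super().__init__()
        self.rnn = nn.GRU(noise_dim + latent_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, output_dim)

    def forward(self, noise, z):               # noise: (batch, seq_len, noise_dim)
        cond = z.unsqueeze(1).expand(-1, noise.size(1), -1)
        h, _ = self.rnn(torch.cat([noise, cond], dim=-1))
        return self.out(h)                     # (batch, seq_len, output_dim)
```

In this reading, the latent class discriminator is trained to classify latent codes while the encoders are trained so that class-specific attributes remain identifiable, and the generator receives those codes as conditioning; how the adversarial and reconstruction losses are actually combined is not specified on this page.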
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Supplementary Material: zip
Reviewed Version (pdf): https://openreview.net/references/pdf?id=jrvyBLq8Gw