SoK: The Great GAN Bake Off, An Extensive Systematic Evaluation of Generative Adversarial Network Architectures for Time Series Synthesis

Anonymous

03 Feb 2022 (modified: 05 May 2023). Submitted to JSYS (Feb 2022).
Keywords: GAN, time series, synthesis, evaluation, time series generation
Abstract: There is no standard approach for comparing the performance of different neural network architectures used for time series synthesis. This hinders evaluating and deciding which architecture should be used for an unknown data set. We propose a combination of metrics that empirically evaluate the performance of neural network architectures trained for time series synthesis. With these measurements we account for temporal correlations, spatial correlations and mode collapse issues within the generated time series. We further investigate how different generator and discriminator architectures interact with each other. The considered architectures include recurrent neural networks, temporal convolutional networks and transformer-based networks. So far, transformer-based models have seen limited application in time series synthesis; hence, we propose a new transformer-based architecture that is able to synthesise time series. We evaluate the proposed architectures and their combinations in over 500 experiments, amounting to over 2500 computing hours. We provide results for four data sets, one univariate and three multivariate, which vary with regard to length and their temporal and spatial correlation patterns. One of the multivariate data sets is an artificial data set constructed in a conditional setup. We use our metrics to compare the performance of generative adversarial network architectures for time series synthesis and verify our findings with quantitative and qualitative evaluations. Our results indicate that temporal convolutional networks outperform recurrent neural networks and transformer-based approaches with regard to the fidelity and flexibility of the generated data. Temporal convolutional networks are also the most stable architecture on a data set prone to mode collapse. The performance of the transformer models strongly depends on the data set characteristics; they struggled to synthesise data sets with high temporal and spatial correlations. Discriminators with recurrent network architectures suffered severely from vanishing gradients. We also show that the performance of a generative adversarial network depends more on the discriminator than on the generator.
Area: Data Science and Reproducibility
Type: Systemization of Knowledge (SoK)
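
For readers unfamiliar with the architecture behind the headline finding: a temporal convolutional network generator stacks dilated causal 1-D convolutions so that the receptive field grows exponentially with depth, which is how it captures long-range temporal correlations. The PyTorch sketch below illustrates this idea only; the paper's actual layer configuration is not reproduced on this page, so every class name, channel size and dilation choice here is an illustrative assumption.

# Illustrative sketch only: all names, channel sizes and dilation choices
# below are assumptions, not the authors' published implementation.
import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    """1-D convolution padded on the left only, so the output at time t
    never depends on inputs after t (causality)."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):  # x: (batch, channels, time)
        x = nn.functional.pad(x, (self.pad, 0))  # left-pad the time axis
        return self.conv(x)

class TCNGenerator(nn.Module):
    """Stack of dilated causal convolutions mapping a noise sequence to a
    synthetic multivariate time series. Doubling the dilation per layer
    grows the receptive field exponentially, the mechanism TCNs use to
    model long-range temporal correlations."""
    def __init__(self, noise_dim=8, hidden=32, out_dim=3, layers=4):
        super().__init__()
        blocks, ch = [], noise_dim
        for i in range(layers):
            blocks += [CausalConv1d(ch, hidden, kernel_size=3, dilation=2 ** i),
                       nn.ReLU()]
            ch = hidden
        blocks.append(nn.Conv1d(hidden, out_dim, kernel_size=1))
        self.net = nn.Sequential(*blocks)

    def forward(self, z):  # z: (batch, noise_dim, time)
        return self.net(z)

# Usage: sample a noise sequence and generate a 3-channel series of length 64.
g = TCNGenerator()
fake = g(torch.randn(16, 8, 64))  # -> shape (16, 3, 64)

In a GAN setup of the kind the abstract evaluates, such a generator would be paired with a discriminator of the same or a different architecture family (recurrent, convolutional or transformer-based), which is exactly the cross-pairing the study varies.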