Approximately Equivariant Recurrent Generative Models for Quasi-Periodic Time Series with a Progressive Training Scheme
Abstract: We present a simple yet effective generative model for time series based on a Recurrent Variational Autoencoder, which we refer to as RVAE-ST. Recurrent layers often suffer from unstable optimization and poor convergence when modeling long sequences. To address these limitations, we introduce a progressive training scheme that gradually increases the sequence length, stabilizing optimization and enabling consistent learning over extended horizons. By composing known components into a recurrent, approximately time-shift-equivariant topology, our model introduces an inductive bias aligned with the structure of quasi-periodic and nearly stationary time series. Across several benchmark datasets, RVAE-ST matches or surpasses state-of-the-art generative models, particularly on quasi-periodic data, while remaining competitive on more irregular signals. We evaluate performance using the ELBO, Fréchet distance, discriminative metrics, and visualizations of the learned latent embeddings.
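As a concrete illustration, the progressive training scheme described in the abstract can be sketched as a curriculum over sequence length. The function names, the doubling schedule, and the crop-based staging below are illustrative assumptions, not the paper's actual implementation:

```python
# Minimal sketch of progressive sequence-length training.
# The schedule, names, and doubling factor are assumptions, not the authors' code.

def progressive_lengths(start, full, factor=2):
    """Yield increasing training sequence lengths up to the full horizon."""
    length = start
    while length < full:
        yield length
        length *= factor
    yield full  # always finish at the full sequence length


def train_stage(model, sequences, seq_len, train_step):
    """Train one curriculum stage on sequences cropped to seq_len."""
    for seq in sequences:
        train_step(model, seq[:seq_len])  # crop to the current horizon


if __name__ == "__main__":
    # The model first sees short windows, then progressively longer ones.
    print(list(progressive_lengths(64, 1024)))  # [64, 128, 256, 512, 1024]
```

Starting from short windows keeps early recurrent-state gradients well conditioned; each later stage then fine-tunes on longer horizons rather than optimizing the full-length objective from scratch.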
Submission Type: Long submission (more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=HQ9C9xcrWZ
Changes Since Last Submission: In this revised version, we have substantially improved the conceptual clarity, mathematical rigor, and empirical validation of our work in response to the reviewers’ comments.
Specifically, we made the following major changes:
Conceptual clarification:
We replaced the earlier notion of stationarity with the more appropriate concept of quasi-periodicity to better capture the properties of real-world time series used in our experiments. This resolves the earlier ambiguity between stationarity and time-shift equivariance.
Formal definition of time-shift equivariance:
We now provide a precise mathematical formulation of approximate time-shift equivariance and clarify how it differs from stationarity in both theoretical and practical terms. This addition strengthens the conceptual foundation of the paper.
Integration of the Echo State Property (ESP):
We introduced the ESP as a formal framework for analyzing the inductive bias of our recurrent architecture. The paper now includes both a theoretical discussion and empirical experiments demonstrating state contraction and forgetting behavior consistent with ESP. This directly addresses the reviewers’ request for evidence that the proposed architecture exhibits approximate equivariance in practice.
Improved mathematical presentation:
Several sections (including the methods and definitions) were rewritten for greater mathematical precision and readability, with clearer notation and explanations.
Together, these revisions address the core conceptual and empirical concerns raised by the reviewers, clarifying the theoretical motivation behind the model and strengthening the evidence for its inductive bias toward time-shift equivariance on quasi-periodic time series.
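For reference, the kind of definition alluded to above can be sketched as follows; the notation and the tolerance ε are our own paraphrase, not the paper's exact formulation:

```latex
% Sketch (our paraphrase): let $S_\tau$ denote the time-shift operator,
% $(S_\tau x)_t = x_{t-\tau}$. A map $f$ on sequences is time-shift
% equivariant if it commutes with every shift:
\[
  f(S_\tau x) = S_\tau f(x) \qquad \text{for all } \tau ,
\]
% and approximately time-shift equivariant if, for some small
% tolerance $\varepsilon > 0$,
\[
  \sup_\tau \bigl\| f(S_\tau x) - S_\tau f(x) \bigr\| \le \varepsilon .
\]
```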
Assigned Action Editor: ~Andreas_Lehrmann1
Submission Number: 6580