Less Is More: Generating Time Series with LLaMA-Style Autoregression in Simple Factorized Latent Spaces
Keywords: Generative Model, Time Series Generation, Autoregressive Transformer
Abstract: Generative models for multivariate time series are essential for data augmentation, simulation, and privacy preservation, yet current state-of-the-art diffusion-based approaches are slow and limited to fixed-length windows. We propose FAR-TS, a simple yet effective framework that combines disentangled Factorization with an AutoRegressive Transformer over a discrete, quantized latent space to generate Time Series. Each time series is decomposed into a data-adaptive basis that captures static cross-channel correlations and temporal coefficients that are vector-quantized into discrete tokens. A LLaMA-style autoregressive Transformer then models these token sequences, enabling fast and controllable generation of arbitrary-length sequences. Owing to its streamlined design, FAR-TS achieves orders-of-magnitude faster generation than Diffusion-TS while preserving cross-channel correlations and an interpretable latent space, yielding high-quality and flexible time series synthesis.
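The factorize-then-quantize pipeline described in the abstract can be sketched in a few lines of NumPy. This is an illustrative toy, not the paper's actual method: the truncated-SVD basis, the codebook size `K`, and the sample-based codebook initialization are all assumptions made here for demonstration. A series `X` (channels × time) is split into a static basis `B` and temporal coefficients `Z`, and each timestep's coefficient vector is mapped to its nearest codebook entry to produce the discrete tokens an autoregressive Transformer would model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multivariate series: C channels, T timesteps, low-rank shared structure.
C, T, k = 8, 256, 3
latent = rng.standard_normal((k, T)).cumsum(axis=1)    # smooth temporal factors
mixing = rng.standard_normal((C, k))                   # static cross-channel mixing
X = mixing @ latent + 0.05 * rng.standard_normal((C, T))

# Step 1: data-adaptive basis via truncated SVD, X ~= B @ Z
# (SVD is an illustrative choice of factorization).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
B = U[:, :k]                  # (C, k) basis: static cross-channel correlations
Z = s[:k, None] * Vt[:k]      # (k, T) temporal coefficients

# Step 2: vector-quantize each timestep's coefficient vector with a small
# nearest-neighbour codebook (codebook size and init are assumptions).
K = 32
codebook = Z[:, rng.choice(T, K, replace=False)].T     # (K, k) code vectors
dists = ((Z.T[:, None, :] - codebook[None]) ** 2).sum(axis=-1)   # (T, K)
tokens = np.argmin(dists, axis=1)                      # (T,) discrete tokens
Z_q = codebook[tokens].T                               # (k, T) quantized coeffs

# Reconstruction from discrete tokens plus the static basis.
X_hat = B @ Z_q
rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(tokens.shape, X_hat.shape)
```

In the full framework, the token sequence `tokens` would be modeled autoregressively (next-token prediction), so generation amounts to sampling tokens of any desired length and mapping them back through the codebook and basis.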
Primary Area: learning on time series and dynamical systems
Submission Number: 18607