Keywords: Diffusion models, sequential data, adaptiveness, score approximation
Abstract: Generating realistic synthetic sequential data is critical in real-world applications such as operations research and finance. While diffusion models have achieved remarkable success in generating static data, their direct extensions to sequential settings often fail to capture temporal dependencies and information structure. Designing diffusion models that can simulate sequential data in an adaptive, non-anticipative manner therefore remains an open challenge.
In this work, we propose a sequential forward–backward diffusion framework for adapted time series generation. Our approach progressively injects and removes noise along the sequence, conditioning each step on the previously generated history to ensure adaptiveness (see the sketch below). We further introduce a novel score-matching objective that enables efficient parallel training. Finally, we establish a score approximation result using transformer networks as an early step towards a statistical estimation theory.
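To make the adaptiveness idea concrete, the following is a minimal illustrative sketch (not the authors' objective or code) of a conditional denoising score-matching loss in which the score prediction for step t of the sequence is conditioned only on the history up to t; the `score_net(x_noisy, history, sigma)` interface is a hypothetical assumption, and the sequential loop is for clarity rather than the parallel training described in the abstract.

```python
import torch

def conditional_dsm_loss(score_net, x_seq, sigma):
    """Illustrative conditional denoising score-matching loss.

    x_seq: (batch, T, d) adapted time series
    sigma: scalar noise level of the Gaussian perturbation kernel
    score_net: hypothetical network taking (noisy step, history, sigma)
    """
    _, T, _ = x_seq.shape
    loss = 0.0
    for t in range(T):
        x_t = x_seq[:, t]          # current step to be generated
        history = x_seq[:, :t]     # non-anticipative conditioning information
        noise = torch.randn_like(x_t)
        x_noisy = x_t + sigma * noise
        # Score of the Gaussian perturbation kernel: -noise / sigma
        target = -noise / sigma
        pred = score_net(x_noisy, history, sigma)
        loss = loss + ((pred - target) ** 2).sum(dim=-1).mean()
    return loss / T
```

In practice, a causally masked (e.g., transformer-based) score network could evaluate all steps in one pass, which is one way such a per-step objective could be trained in parallel.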
Submission Number: 49