Keywords: Functional Time Series, Transformers, Representation Learning, Unstructured Data, Yield Curve Forecasting
TL;DR: Unifying functional, temporal, and unstructured data into a single representation learning framework, demonstrated through yield curve forecasting but broadly applicable to other domains.
Abstract: This paper presents a unified framework for learning representations from data that combine three challenging aspects: continuous functional structure, discrete temporal dynamics, and unstructured vector inputs. Our proposed models extend attention mechanisms to handle entire functions along a continuous axis, capture evolution over time, and integrate irregular or unordered features through modality fusion. This design enables the network to respect smoothness, temporal dependence, and heterogeneity while learning unified representations across modalities. As a motivating application, we study yield curve forecasting, where inputs may be full yield curves or bond-level trade data, and outputs are predicted curves. Beyond forecasting, the framework applies broadly, for instance to reinforcement learning tasks where policies evolve over time, or to generative models that produce structured functional outputs. Empirical results on bond datasets show consistent gains over classical econometric models, underscoring the potential of our approach as a general blueprint for representation learning with functional–temporal–unstructured data.
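The abstract's idea of attention "along a continuous axis" can be illustrated with a minimal sketch: a yield curve is sampled at a set of maturities, each point is encoded together with its position on the maturity axis, and standard scaled dot-product attention mixes information across the curve. This is not the paper's actual architecture; the function name `curve_attention`, the (yield, maturity) encoding, and the random projection weights are illustrative assumptions only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def curve_attention(values, maturities, d_model=16, seed=0):
    """Toy attention over a discretized yield curve (illustrative only).

    values: (n,) yields sampled at `maturities` (n,), in years.
    Each maturity point attends to all others, so the output
    representation at each point mixes information along the
    continuous maturity axis.
    """
    rng = np.random.default_rng(seed)
    # Encode each point as (yield, maturity) so attention is aware
    # of its position on the continuous axis.
    x = np.stack([values, maturities], axis=1)            # (n, 2)
    Wq, Wk, Wv = (rng.standard_normal((2, d_model)) * 0.1 for _ in range(3))
    q, k, v = x @ Wq, x @ Wk, x @ Wv                      # (n, d_model)
    scores = q @ k.T / np.sqrt(d_model)                   # (n, n)
    return softmax(scores, axis=-1) @ v                   # (n, d_model)

# Toy curve: yields (%) at standard maturities.
maturities = np.array([0.25, 1.0, 2.0, 5.0, 10.0, 30.0])
yields = np.array([5.3, 5.0, 4.6, 4.3, 4.4, 4.6])
rep = curve_attention(yields, maturities)
print(rep.shape)  # (6, 16)
```

In a full functional-temporal model one would stack such a layer with attention across time steps and fuse in unstructured bond-level features; the sketch shows only the cross-maturity step.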
Primary Area: learning on time series and dynamical systems
Submission Number: 23115