Time Series Kernels based on Nonlinear Vector AutoRegressive Delay Embeddings

Published: 21 Sept 2023, Last Modified: 02 Nov 2023, NeurIPS 2023 poster
Keywords: Time Series, Kernel methods, NVAR processes, Dynamical systems, Reservoir Computing
TL;DR: We introduce a new kernel for univariate and multivariate time series that compares the linear dynamics extracted from NVAR delay embeddings and overcomes drawbacks of Reservoir Computing-based approaches.
Abstract: Kernel design is a pivotal but challenging aspect of time series analysis, especially in the context of small datasets. In recent years, Reservoir Computing (RC) has emerged as a powerful tool to compare time series based on the underlying dynamics of the generating process rather than the observed data. However, the performance of RC highly depends on the hyperparameter setting, which is hard to interpret and costly to optimize because of the recurrent nature of RC. Here, we present a new kernel for time series based on the recently established equivalence between reservoir dynamics and Nonlinear Vector AutoRegressive (NVAR) processes. The kernel is non-recurrent and depends on a small set of meaningful hyperparameters, for which we suggest an effective heuristic. We demonstrate excellent performance on a wide range of real-world classification tasks, both in terms of accuracy and speed. This further advances the understanding of RC representation learning models and extends the typical use of the NVAR framework to kernel design and representation of real-world time series data.
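To illustrate the idea of comparing series through the linear dynamics extracted from NVAR delay embeddings, here is a minimal sketch. All concrete choices (two lags, quadratic monomial features, a ridge-regression readout, and an RBF kernel on the fitted coefficients) are assumptions for illustration, not the paper's exact construction or hyperparameter heuristic.

```python
# Illustrative sketch only: NVAR-style delay-embedding kernel.
# Assumed choices (not from the paper): lags=2, quadratic monomials,
# ridge readout, RBF comparison of the fitted coefficient matrices.
import numpy as np
from itertools import combinations_with_replacement

def nvar_features(x, lags=2):
    """Delay embedding of a (T, d) series plus quadratic monomials."""
    T, d = x.shape
    lin = np.hstack([x[lags - l - 1:T - l - 1] for l in range(lags)])  # (T-lags, lags*d)
    quad = np.column_stack([lin[:, i] * lin[:, j]
                            for i, j in combinations_with_replacement(range(lin.shape[1]), 2)])
    return np.hstack([lin, quad]), x[lags:]  # features and next-step targets

def linear_dynamics(x, lags=2, ridge=1e-3):
    """Ridge-regression readout mapping NVAR features to the next observation."""
    Phi, y = nvar_features(x, lags)
    W = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(Phi.shape[1]), Phi.T @ y)
    return W.ravel()

def nvar_kernel(series_a, series_b, gamma=1.0, lags=2):
    """RBF kernel comparing the extracted linear dynamics of two series."""
    wa, wb = linear_dynamics(series_a, lags), linear_dynamics(series_b, lags)
    return np.exp(-gamma * np.sum((wa - wb) ** 2))

# Usage: two toy univariate series of shape (T, 1)
rng = np.random.default_rng(0)
s1 = np.sin(np.linspace(0, 10, 200)).reshape(-1, 1) + 0.01 * rng.standard_normal((200, 1))
s2 = np.cos(np.linspace(0, 10, 200)).reshape(-1, 1) + 0.01 * rng.standard_normal((200, 1))
print(nvar_kernel(s1, s2))
```

Because the features are built by a fixed, non-recurrent delay embedding, each series is summarized once by its fitted coefficients, and the kernel then compares those summaries rather than the raw observations.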
Supplementary Material: zip
Submission Number: 13975