Abstract: Ordinary Differential Equation (ODE)-based models have become popular foundation models for solving many time series problems. Combining neural ODEs with traditional RNNs has provided some of the best representations for irregular time series. However, ODE-based models typically define the trajectory of hidden states from either the initial observed value or the most recent observation, raising questions about their effectiveness on longer sequences and extended time intervals. In this article, we explore the behaviour of ODE-based models on time series data with varying degrees of sparsity. We introduce SeqLink, an innovative neural architecture designed to enhance the robustness of sequence representations. Unlike traditional approaches that rely solely on the hidden state generated from the last observed value, SeqLink leverages ODE latent representations derived from multiple data samples, enabling it to generate robust representations regardless of sequence length or data sparsity level. The core concept behind our model is to define hidden states for unobserved values based on the relationships between samples (links between sequences). Through extensive experiments on partially observed synthetic and real-world datasets, we demonstrate that SeqLink improves the modelling of intermittent time series, consistently outperforming state-of-the-art approaches.
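To illustrate the limitation the abstract refers to, the sketch below shows a generic ODE-RNN-style update, where the hidden state evolves continuously from the most recent observation until the next one arrives. This is a minimal illustration of the baseline behaviour, not SeqLink itself; the dynamics function `f`, the `rnn_update` stand-in, and all parameters are assumptions chosen for the example.

```python
import numpy as np

def ode_rnn_step(h, t0, t1, f, rnn_update, x=None, n_steps=10):
    """Evolve hidden state h from t0 to t1 with Euler steps on dh/dt = f(h),
    then optionally apply a discrete RNN update at a new observation x.

    Between observations, h follows only the ODE started from the most
    recent observed state, which is the dependence discussed above."""
    dt = (t1 - t0) / n_steps
    for _ in range(n_steps):
        h = h + dt * f(h)          # continuous dynamics between observations
    if x is not None:
        h = rnn_update(h, x)       # discrete jump when a value is observed
    return h

# Toy choices (assumptions, not the paper's components):
f = lambda h: -0.5 * h                    # simple decaying latent dynamics
rnn_update = lambda h, x: np.tanh(h + x)  # stand-in for a GRU/RNN cell

h0 = np.array([1.0, -1.0])
h1 = ode_rnn_step(h0, 0.0, 1.0, f, rnn_update, x=np.array([0.2, 0.2]))
```

With no observation in a long interval, the state is determined entirely by the ODE flow from the last observed value, so the representation degrades as gaps grow; SeqLink instead draws on latent representations from related samples to fill such gaps.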
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission: [EiC] Upon the authors' request, we have uploaded a new pdf in which the name of the model is updated in Figure 5 from "CrossPyramid" to "SeqLink."
Code: https://github.com/FtoonAbushaqra/SeqLink.git
Assigned Action Editor: ~Adam_Arany1
Submission Number: 2422