NeRT: Implicit Neural Representation for Time Series

19 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Implicit neural representation, time-series
Abstract: Time series is one of the most fundamental data types in real-world environments, and many deep-learning models have been proposed to handle it effectively, ranging from recurrent neural networks to Transformers to differential equation-based models. These existing models, however, tend to underperform due to, among other issues, irregular measurements and sensitivity to hyper-parameters (e.g., the window size). Modeling time series as a continuous-in-time signal via implicit neural representations (INRs) is an alternative approach that can overcome such limitations. However, naïve adoptions of existing INR frameworks for time series do not yield promising outcomes. To address this, we propose NeRT, a novel class of INRs tailored to time-series data; the core ideas are to design a new coordinate system, to employ learnable Fourier features, and to model the periodic and scale components of time series separately. Thanks to the inherent characteristics of INRs, our model can learn from both regular and irregular time series in a continuous-time manner and perform time series forecasting and imputation at the same time with a single trained model. Moreover, we show that NeRT can be efficiently parameterized via latent modulation. Through extensive experiments with real-world and scientific datasets, we demonstrate that NeRT significantly outperforms baselines, including popular INR-based methods and previous time series models.
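The abstract's core idea — an INR that maps a continuous time coordinate to a series value through learnable Fourier features — can be illustrated with a minimal sketch. This is not the authors' NeRT architecture (the paper's coordinate system, periodic/scale decomposition, and latent modulation are omitted); it is a generic forward pass under the assumption that the frequencies of the Fourier features are trainable parameters rather than fixed, using NumPy and hypothetical names (`FourierINR`, `n_freq`).

```python
import numpy as np

rng = np.random.default_rng(0)

class FourierINR:
    """Hypothetical sketch: time coordinate t -> learnable Fourier
    features -> small MLP -> predicted series value."""

    def __init__(self, n_freq=16, hidden=32):
        # In a real model these would all be optimized by gradient
        # descent; here they are just randomly initialized.
        self.B = rng.normal(size=n_freq)            # learnable frequencies
        self.W1 = rng.normal(size=(2 * n_freq, hidden)) * 0.1
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(size=(hidden, 1)) * 0.1
        self.b2 = np.zeros(1)

    def features(self, t):
        # t: (N,) time coordinates -> (N, 2*n_freq) Fourier features
        angles = 2 * np.pi * np.outer(t, self.B)
        return np.concatenate([np.sin(angles), np.cos(angles)], axis=1)

    def __call__(self, t):
        h = np.maximum(self.features(t) @ self.W1 + self.b1, 0.0)  # ReLU
        return (h @ self.W2 + self.b2).squeeze(-1)

model = FourierINR()
# Because the input is a continuous coordinate, the same trained model
# can be queried at observed, missing, or future timestamps alike,
# which is why a single INR can serve both imputation and forecasting.
t = np.linspace(0.0, 1.0, 5)
y = model(t)
print(y.shape)  # (5,)
```

Querying arbitrary (possibly irregular) values of `t` is what distinguishes this continuous-time formulation from window-based sequence models.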
Supplementary Material: zip
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1852