iHyperTime: Interpretable Time Series Generation with Implicit Neural Representations

TMLR Paper 2332 Authors

04 Mar 2024 (modified: 11 Mar 2024) · Under review for TMLR
Abstract: Implicit neural representations (INRs) have emerged as a powerful tool that provides an accurate and resolution-independent encoding of data. Their robustness as general approximators has been shown across diverse data modalities, such as images, video, audio, and 3D scenes. However, little attention has been given to leveraging these architectures for time series data. Addressing this gap, we propose an approach for time series generation based on two novel architectures: TSNet, an INR network for interpretable trend-seasonality time series representation, and iHyperTime, a hypernetwork architecture that leverages TSNet for time series generalization and synthesis. Through evaluations of fidelity and usefulness metrics, we demonstrate that iHyperTime outperforms current state-of-the-art methods in challenging scenarios that involve long or irregularly sampled time series, while performing on par on regularly sampled data. Furthermore, we showcase iHyperTime's fast training speed, comparable to that of the fastest existing methods for short sequences and significantly faster for longer ones. Finally, we empirically validate the quality of the model's unsupervised trend-seasonality decomposition by comparing it against the well-established STL method.
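To make the idea of an interpretable trend-seasonality representation concrete, the sketch below fits a series sampled at arbitrary (possibly irregular) timestamps with a polynomial trend basis plus a Fourier seasonal basis via least squares. This is a hedged illustration of the general decomposition concept only: the function name, basis choices, and hyperparameters (`poly_degree`, `n_harmonics`, `period`) are our own assumptions, not TSNet's learned architecture.

```python
import numpy as np

def trend_seasonality_fit(t, y, poly_degree=1, n_harmonics=2, period=1.0):
    """Decompose y(t) into polynomial trend + Fourier seasonality.

    A classical least-squares stand-in for the kind of trend-seasonality
    split discussed in the abstract; not the paper's TSNet model.
    Works on irregularly sampled t, since the basis is evaluated
    pointwise at the given timestamps (resolution-independent).
    """
    cols = [t ** d for d in range(poly_degree + 1)]      # trend basis: 1, t, t^2, ...
    for k in range(1, n_harmonics + 1):                  # seasonal basis: Fourier harmonics
        cols.append(np.sin(2 * np.pi * k * t / period))
        cols.append(np.cos(2 * np.pi * k * t / period))
    X = np.stack(cols, axis=1)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    n_trend = poly_degree + 1
    trend = X[:, :n_trend] @ coef[:n_trend]              # trend component
    season = X[:, n_trend:] @ coef[n_trend:]             # seasonal component
    return trend, season

# Example on an irregularly sampled series: linear trend + unit-period sinusoid.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 3.0, size=80))
y = 0.5 * t + np.sin(2 * np.pi * t)
trend, season = trend_seasonality_fit(t, y)
```

Because the true signal lies in the span of the chosen bases, the least-squares fit recovers the two components exactly here; a learned INR plays an analogous role for signals without a known closed form.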
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Grigorios_Chrysos1
Submission Number: 2332