Towards Information-Theoretic Pattern Mining in Time Series

22 Sept 2022 (modified: 13 Feb 2023) · ICLR 2023 Conference Withdrawn Submission
Keywords: Deep learning, Unsupervised learning, Variational Autoencoders
TL;DR: The paper offers a novel method for discovering patterns from time series data.
Abstract: Time series pattern discovery is one of the most fundamental tasks in data mining. Existing literature addressing this task often follows a generic paradigm in which a similarity metric is defined a priori and an extensive pattern-matching search is executed to find similar subsequences based on the metric. Algorithms developed under this paradigm therefore risk missing important patterns that do not meet the implicit biases within such pre-defined metrics. To mitigate this risk, we propose a new information-theoretic discovery paradigm that aims to find the most informative patterns on an embedding space that can learn to encode representative statistical variation trends in the time series. This paradigm is achieved by a probabilistic time-to-pattern mining algorithm, T2P, based on a biophysically inspired adaptation of a variational auto-encoder (VAE). The adapted VAE incorporates a specific design for its latent space that learns to surface the most recurring and informative patterns without the need to run costly pattern-matching searches. Empirically, we demonstrate that our method is more scalable than existing methods. Furthermore, T2P can find multiple diverse patterns that more effectively compress and represent the time series without relying on prior knowledge of the data.
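The overall paradigm the abstract describes (embed sliding windows of a time series and surface recurring, informative subsequences from the embedding, rather than running a metric-based pattern-matching search) can be sketched in a toy form. The sketch below is not T2P itself: it substitutes a linear PCA embedding for the paper's adapted VAE encoder, and uses latent-norm ranking as a stand-in for the paper's latent-space design; the synthetic series, window length, and scoring rule are all illustrative assumptions.

```python
import numpy as np

# Toy sketch of the embedding-based discovery paradigm (NOT the paper's T2P):
# slide a window over the series, embed each window, and rank windows by how
# far they sit from the bulk of the embedding. T2P uses an adapted VAE; here a
# linear PCA embedding stands in for the encoder, for brevity.

rng = np.random.default_rng(0)

# Synthetic series: low-amplitude noise with a sine "motif" planted 4 times.
n, w = 600, 32
series = 0.1 * rng.standard_normal(n)
motif = np.sin(np.linspace(0.0, 2.0 * np.pi, w))
motif_starts = (50, 200, 350, 500)
for start in motif_starts:
    series[start:start + w] += motif

# All sliding windows (one candidate subsequence per row), mean-centered.
windows = np.lib.stride_tricks.sliding_window_view(series, w).copy()
windows -= windows.mean(axis=1, keepdims=True)

# Linear embedding via SVD/PCA (stand-in for a VAE encoder's latent mean).
_, _, vt = np.linalg.svd(windows, full_matrices=False)
z = windows @ vt[:2].T  # 2-D latent code per window

# Score windows by latent norm: motif-bearing windows dominate the top
# principal directions, so they separate from the noise bulk.
scores = np.linalg.norm(z, axis=1)
top = np.argsort(scores)[::-1][:20]

# Every top-ranked window should overlap one of the planted motifs.
hits = [t for t in top if any(abs(t - s) < w for s in motif_starts)]
print(len(hits), "of", len(top), "top windows overlap a planted motif")
```

Unlike a matched-filter or nearest-neighbor motif search, nothing in this pipeline encodes the motif's shape a priori; the recurring structure is recovered from the embedding itself, which is the property the abstract's paradigm emphasizes.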
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Unsupervised and Self-supervised learning
Supplementary Material: zip