Context-invariant, multi-variate time series representations

Published: 28 Jan 2022, Last Modified: 13 Feb 2023. ICLR 2022 Submission.
Keywords: time series, representation learning, contrastive learning, context invariance, anomaly detection, domain adversarial learning
Abstract: Modern time series corpora, in particular those derived from sensor-based data, exhibit characteristics that have so far not been adequately addressed in the literature on representation learning for time series. In particular, such corpora often make it possible to distinguish between \emph{exogenous} signals that describe a context influencing a given appliance and \emph{endogenous} signals that describe the internal state of the appliance. We propose an embedding based on temporal convolutional networks that improves on the state of the art by adapting recent advances in contrastive learning to the time series domain and by adopting a multi-resolution approach. Employing techniques borrowed from domain-adversarial learning, we make the embeddings invariant to the context provided by the exogenous signals. To show the effectiveness of our approach, we contribute new data sets to the research community and use both new and existing data sets to empirically verify that we can separate normal from abnormal internal appliance behaviour independently of the external signals in data sets from IoT and DevOps.
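The abstract mentions domain-adversarial learning as the mechanism for making embeddings invariant to the exogenous context. The paper's exact formulation is not given here, but the standard building block of domain-adversarial training is a gradient-reversal layer: an identity map in the forward pass whose backward pass negates (and scales) the gradient flowing from a context classifier into the encoder, pushing the encoder toward context-invariant features. A minimal numpy sketch of that layer, with the class and parameter names chosen for illustration only:

```python
import numpy as np

class GradientReversal:
    """Identity in the forward pass; multiplies the incoming gradient
    by -lam in the backward pass. Placed between a feature encoder and
    a context (domain) classifier, it trains the encoder to *confuse*
    that classifier, encouraging context-invariant embeddings.
    Illustrative sketch only; not the paper's exact architecture."""

    def __init__(self, lam=1.0):
        self.lam = lam  # trade-off between task loss and invariance pressure

    def forward(self, x):
        # Forward pass: features reach the context classifier unchanged.
        return x

    def backward(self, grad_output):
        # Backward pass: reverse and scale the classifier's gradient
        # before it flows back into the encoder.
        return -self.lam * grad_output

# Tiny demonstration of the reversal behaviour.
grl = GradientReversal(lam=0.5)
features = np.array([1.0, -2.0, 3.0])
grad_from_classifier = np.ones(3)

out = grl.forward(features)            # identical to the input
grad_to_encoder = grl.backward(grad_from_classifier)  # negated, scaled by 0.5

assert np.allclose(out, features)
assert np.allclose(grad_to_encoder, np.array([-0.5, -0.5, -0.5]))
```

In an autograd framework this would be wrapped as a custom function (e.g. a `torch.autograd.Function` in PyTorch) so the reversal happens automatically during backpropagation.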
One-sentence Summary: We present embeddings for time series that are invariant with respect to a given (time series) context and enable downstream applications such as data exploration or contextual anomaly detection.