Causal Discovery from Conditionally Stationary Time Series

Published: 09 Jul 2022, Last Modified: 05 May 2023, CRL@UAI 2022 Poster
Keywords: causal discovery, temporal inference, graph neural network, time series, non-stationary, probabilistic modelling
Abstract: Causal discovery, i.e., inferring underlying cause-effect relationships from observations, has been shown to be highly challenging for AI systems. In time series modeling context, existing causal discovery methods mainly consider constrained scenarios with fully observed variables and/or data from stationary time series. We develop a causal discovery approach to a wide class of non-stationary time series that are conditionally stationary, where the non-stationary behaviour is modeled as stationarity conditioned on a set of (possibly hidden) state variables whose dynamics may be dependent on the observed sequence. Named State-Dependent Causal Inference (SDCI), our approach is able to recover the underlying causal dependencies, provably with fully-observed states and empirically with hidden states. The latter is confirmed by experiments on both synthetic linear system and spring-connected particle interaction data, where SDCI achieves superior performance over baseline causal discovery methods.
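To make the problem setting concrete, below is a minimal sketch of what a conditionally stationary time series can look like: an observed vector evolves under one of several stationary linear dynamics, and the active regime is selected by a discrete state whose value depends on the observed sequence itself. This is an illustration of the setting only, not the authors' data generator or the SDCI method; the dimensions, noise scale, and the threshold rule for the state are assumptions made for this example.

```python
# Illustrative sketch (not the authors' code): a conditionally stationary
# linear time series. The observed vector x_t evolves under one of several
# stationary linear dynamics; the regime is chosen by a discrete state s_t
# whose value depends on the observed sequence. All parameter choices here
# are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

d = 3          # dimension of the observed vector x_t
T = 500        # sequence length
n_states = 2   # number of discrete states

# One stable transition matrix (spectral radius < 1) per state,
# i.e. a distinct causal edge-weight pattern in each regime.
A = rng.normal(scale=0.4, size=(n_states, d, d))
for k in range(n_states):
    A[k] *= 0.9 / max(1.0, np.max(np.abs(np.linalg.eigvals(A[k]))))

def next_state(x):
    # State dynamics depend on the observations: here, an arbitrary
    # illustrative threshold rule on the first coordinate.
    return int(x[0] > 0.0)

x = np.zeros((T, d))
s = np.zeros(T, dtype=int)
x[0] = rng.normal(size=d)
for t in range(1, T):
    s[t] = next_state(x[t - 1])                            # (possibly hidden) state
    x[t] = A[s[t]] @ x[t - 1] + 0.1 * rng.normal(size=d)   # regime-specific dynamics

# Conditioned on s_t, the dynamics are stationary; marginally they are not.
print("fraction of time in state 1:", s.mean())
```

Conditioned on the state sequence, each regime follows fixed stationary dynamics, which is what allows regime-specific causal graphs to be identified; the causal discovery task described in the abstract is the inverse problem of recovering those per-state dependencies from the observations alone.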