Abstract: Temporal causal discovery aims to uncover causal relations in time series data. Existing deep learning-based methods typically infer causality by analyzing the parameters of only some components of the trained model, yielding an incomplete mapping from model parameters to causality and leaving the remaining components unexamined. To address this issue, this paper presents an interpretable transformer-based causal discovery model termed CausalFormer, which consists of: 1) a causality-aware transformer that learns causal representations via multi-kernel causal convolution under the temporal priority constraint, and 2) a decomposition-based causality detector that identifies causal relations by interpreting the global structure of the trained transformer with regression relevance propagation.
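
To make the temporal priority constraint mentioned above concrete, the following is a minimal illustrative sketch (not the authors' implementation) of a multi-kernel causal 1D convolution in PyTorch: each branch uses a different kernel size, and left-only padding ensures the output at time t depends solely on inputs at times ≤ t. The class name `MultiKernelCausalConv` and the particular kernel sizes are hypothetical, chosen purely for illustration.

```python
import torch
import torch.nn as nn


class MultiKernelCausalConv(nn.Module):
    """Illustrative multi-kernel causal convolution (hypothetical sketch)."""

    def __init__(self, in_channels, out_channels, kernel_sizes=(2, 3, 5)):
        super().__init__()
        self.kernel_sizes = kernel_sizes
        self.convs = nn.ModuleList(
            nn.Conv1d(in_channels, out_channels, kernel_size=k)
            for k in kernel_sizes
        )

    def forward(self, x):
        # x: (batch, in_channels, time)
        outputs = []
        for k, conv in zip(self.kernel_sizes, self.convs):
            # Pad only on the left so no future time steps leak into the output,
            # enforcing the temporal priority constraint.
            padded = nn.functional.pad(x, (k - 1, 0))
            outputs.append(conv(padded))
        # Combine the branches; here they are simply summed.
        return torch.stack(outputs, dim=0).sum(dim=0)


# Usage: a batch of 4 series, 8 variables, 50 time steps.
x = torch.randn(4, 8, 50)
layer = MultiKernelCausalConv(in_channels=8, out_channels=16)
print(layer(x).shape)  # torch.Size([4, 16, 50])
```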