Causal Discovery Using Regression-Based Conditional Independence Tests

AAAI 2017 (modified: 04 Sept 2019)
Abstract: Conditional independence (CI) testing is an important tool in causal discovery. Generally, by using CI tests, a set of Markov equivalence classes consistent with the observed data can be estimated by checking whether each pair of variables x and y is d-separated given a set of variables Z. Due to the curse of dimensionality, however, CI tests often fail to return reliable results when Z is high-dimensional. In this paper, we propose a regression-based CI test that relaxes the test of x ⊥ y | Z to simpler unconditional independence tests of x − f(Z) ⊥ y − g(Z), together with x − f(Z) ⊥ Z or y − g(Z) ⊥ Z, under the assumption that the data-generating process follows additive noise models (ANMs). When the ANM is identifiable, we prove that x − f(Z) ⊥ y − g(Z) ⇒ x ⊥ y | Z. We also show that 1) f and g can be easily estimated by regression, 2) our test is more powerful than state-of-the-art kernel CI tests, and 3) existing causal learning algorithms can infer many more causal directions by using the proposed method.
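The abstract outlines the core recipe: regress x and y on Z, then test the residuals for unconditional independence. Below is a minimal sketch of that idea, assuming kernel ridge regression for f and g and a Gaussian-kernel HSIC permutation test for the residual independence check; these are illustrative choices, not necessarily the estimators or test statistics used in the paper, and the additional residual-versus-Z checks are omitted for brevity.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge


def _gaussian_gram(a, sigma=None):
    """Gaussian-kernel Gram matrix for a 1-D sample, median-heuristic bandwidth."""
    a = np.asarray(a, dtype=float).reshape(-1, 1)
    d2 = (a - a.T) ** 2
    if sigma is None:
        med = np.median(d2[d2 > 0]) if np.any(d2 > 0) else 1.0
        sigma = np.sqrt(0.5 * med)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def hsic_perm_test(u, v, n_perm=500, seed=0):
    """Permutation p-value for HSIC between two 1-D samples."""
    rng = np.random.default_rng(seed)
    n = len(u)
    H = np.eye(n) - np.ones((n, n)) / n
    Ku, Kv = _gaussian_gram(u), _gaussian_gram(v)
    Kuc = H @ Ku @ H                      # centered kernel matrix of u
    stat = np.trace(Kuc @ Kv) / n ** 2    # biased HSIC estimate
    null = np.empty(n_perm)
    for i in range(n_perm):
        p = rng.permutation(n)
        null[i] = np.trace(Kuc @ Kv[np.ix_(p, p)]) / n ** 2
    return (1 + np.sum(null >= stat)) / (1 + n_perm)


def regression_ci_test(x, y, Z, alpha=0.05):
    """Sketch of a regression-based CI test: x ⊥ y | Z is not rejected if the
    residuals x - f(Z) and y - g(Z) look unconditionally independent."""
    rx = x - KernelRidge(kernel="rbf", alpha=1e-2).fit(Z, x).predict(Z)
    ry = y - KernelRidge(kernel="rbf", alpha=1e-2).fit(Z, y).predict(Z)
    p_value = hsic_perm_test(rx, ry)
    return p_value > alpha, p_value       # True -> cannot reject x ⊥ y | Z
```

Because both regressions and the residual test involve only unconditional quantities, the dimensionality of Z enters solely through the regression step, which is the source of the robustness to high-dimensional conditioning sets claimed in the abstract.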