On Learning Necessary and Sufficient Causal Graphs

Published: 21 Sept 2023, Last Modified: 02 Nov 2023 · NeurIPS 2023 spotlight
Keywords: Causal structural learning, Necessity and sufficiency, Natural causal effects, Probabilities of causation, Variable selection
TL;DR: We aim to learn necessary and sufficient causal graphs containing only causally relevant variables for specific outcomes by linking probabilities of causation and natural causal effects.
Abstract: The causal revolution has stimulated interest in understanding complex relationships in various fields. Most existing methods aim to discover causal relationships among all variables within a complex large-scale graph. However, in practice, only a small subset of variables in the graph is relevant to the outcomes of interest. Consequently, causal estimation with the full causal graph, particularly given limited data, could lead to numerous *falsely discovered, spurious* variables that exhibit high correlation with, but exert no causal impact on, the target outcome. In this paper, we propose learning a class of *necessary and sufficient causal graphs (NSCG)* that exclusively comprise causally relevant variables for an outcome of interest, which we term *causal features*. The key idea is to employ *probabilities of causation* to systematically evaluate the importance of features in the causal graph, allowing us to identify a subgraph relevant to the outcome of interest. To learn NSCG from data, we develop a *necessary and sufficient causal structural learning (NSCSL)* algorithm by establishing theoretical properties and relationships between probabilities of causation and natural causal effects of features. Across empirical studies of simulated and real data, we demonstrate that NSCSL outperforms existing algorithms and can reveal crucial yeast genes for target heritable traits of interest.
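For readers unfamiliar with the terminology, the *probabilities of causation* invoked above follow Pearl's standard counterfactual definitions; the sketch below is a minimal illustration under that convention, and the paper's exact estimands may differ. For a binary feature $X$ and binary outcome $Y$, the probability of necessity and sufficiency (PNS) is

$$\mathrm{PNS} = P\big(Y_{X=1} = 1,\ Y_{X=0} = 0\big),$$

i.e., the probability that the outcome responds to the feature in both directions. Under exogeneity and monotonicity, this reduces to $\mathrm{PNS} = P(Y=1 \mid X=1) - P(Y=1 \mid X=0)$, a simple risk difference, which illustrates the kind of link between probabilities of causation and causal effects that the abstract describes.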
Supplementary Material: zip
Submission Number: 9312