Causal Reasoning in the Presence of Latent Confounders via Neural ADMG Learning

03 Oct 2022 (modified: 05 May 2023) · CML4Impact
Keywords: causality, causal discovery, causal inference, structural equation model, latent confounders, variational inference
TL;DR: A new method for causal discovery and inference in the presence of latent confounders
Abstract: Latent confounding has been a long-standing obstacle to causal reasoning from observational data. One popular approach is to model the data using acyclic directed mixed graphs (ADMGs), which describe ancestral relations between variables using directed and bidirected edges. However, existing ADMG-based methods rely either on linear functional assumptions or on a discrete search that is complicated to use and lacks computational tractability for large datasets. In this work, we extend the existing body of work and develop a novel gradient-based approach to learning an ADMG with nonlinear functional relations from observational data. We first show that the presence of latent confounding is identifiable under the assumptions of bow-free ADMGs with nonlinear additive noise models. With this insight, we propose a novel neural causal model based on autoregressive flows, which not only enables us to model complex causal relationships behind the data, but also to estimate their functional relationships (and hence treatment effects) simultaneously. We validate our approach via experiments on both synthetic and real-world datasets, and demonstrate competitive performance against relevant baselines.
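To make the general idea of gradient-based ADMG learning concrete, the sketch below shows a highly simplified, hypothetical variant: directed and bidirected edges are represented by continuous edge-probability matrices, each variable is modeled as a nonlinear additive-noise function (a small MLP) of its soft-selected directed parents, and the bow-free assumption is encouraged with a penalty that discourages a directed and a bidirected edge between the same pair of variables, alongside a NOTEARS-style acyclicity penalty. This is not the authors' autoregressive-flow model and omits their variational treatment; all names (AdmgLearner, acyclicity_penalty, bowfree_penalty) and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation) of gradient-based
# bow-free ADMG learning with nonlinear additive-noise mechanisms.
import torch
import torch.nn as nn

class AdmgLearner(nn.Module):
    def __init__(self, d, hidden=16):
        super().__init__()
        # Logits for directed (W) and bidirected (B) edge probabilities.
        self.w_logits = nn.Parameter(torch.zeros(d, d))
        self.b_logits = nn.Parameter(torch.zeros(d, d))
        # One small MLP per variable: x_j = f_j(soft parents of x_j) + noise.
        self.mechanisms = nn.ModuleList(
            nn.Sequential(nn.Linear(d, hidden), nn.Tanh(), nn.Linear(hidden, 1))
            for _ in range(d)
        )
        self.d = d

    def edge_probs(self):
        eye = torch.eye(self.d)
        W = torch.sigmoid(self.w_logits) * (1 - eye)        # no self-loops
        B = torch.sigmoid(self.b_logits)
        B = 0.5 * (B + B.T) * (1 - eye)                     # bidirected edges are symmetric
        return W, B

    def reconstruction_loss(self, x):
        # Predict each variable from its soft directed parents only.
        W, _ = self.edge_probs()
        loss = 0.0
        for j in range(self.d):
            masked = x * W[:, j]                            # soft parent selection for x_j
            pred = self.mechanisms[j](masked).squeeze(-1)
            loss = loss + ((x[:, j] - pred) ** 2).mean()
        return loss

def acyclicity_penalty(W):
    # NOTEARS-style penalty: zero iff the weighted directed graph is acyclic.
    return torch.trace(torch.matrix_exp(W * W)) - W.shape[0]

def bowfree_penalty(W, B):
    # A "bow" is a pair of variables joined by both a directed and a bidirected
    # edge; penalize their joint presence to enforce the bow-free assumption.
    return ((W + W.T) * B).sum()

# Toy usage: three observed variables, with a latent confounder of x0 and x2.
torch.manual_seed(0)
n = 2000
u = torch.randn(n)                                          # latent confounder
x0 = u + 0.3 * torch.randn(n)
x1 = torch.tanh(x0) + 0.3 * torch.randn(n)
x2 = torch.sin(x1) + u + 0.3 * torch.randn(n)
data = torch.stack([x0, x1, x2], dim=1)

model = AdmgLearner(d=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(500):
    W, B = model.edge_probs()
    loss = (model.reconstruction_loss(data)
            + 10.0 * acyclicity_penalty(W)
            + 10.0 * bowfree_penalty(W, B)
            + 1e-2 * (W.sum() + B.sum()))                   # sparsity
    opt.zero_grad()
    loss.backward()
    opt.step()

W, B = model.edge_probs()
torch.set_printoptions(precision=2)
print("directed edge probabilities:\n", W.detach())
print("bidirected edge probabilities:\n", B.detach())
```

In this sketch the bidirected matrix B plays the role of the latent-confounder structure: a high entry B[i, j] indicates that x_i and x_j share an unobserved common cause, which the bow-free penalty keeps disjoint from the directed edges so that the two edge types remain distinguishable.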