Active Bayesian Causal Inference

Published: 21 Oct 2022, Last Modified: 08 Sept 2024 · nCSI WS @ NeurIPS 2022 Poster · Readers: Everyone
Keywords: Bayesian methods, causal inference, causal discovery, causal reasoning, active learning, experimental design, probabilistic machine learning, Gaussian processes
TL;DR: We propose Active Bayesian Causal Inference (ABCI), a fully Bayesian active learning framework for integrated causal discovery and reasoning with experimental design.
Abstract: Causal discovery and causal reasoning are classically treated as separate and consecutive tasks: one first infers the causal graph, and then uses it to estimate causal effects of interventions. However, such a two-stage approach is uneconomical, especially in terms of actively collected interventional data, since the causal query of interest may not require a fully specified causal model. From a Bayesian perspective, it is natural to treat a causal query (e.g., the causal graph or some causal effect) as subject to posterior inference, while other unobserved quantities ought to be marginalized out. In this work, we propose Active Bayesian Causal Inference (ABCI), a fully Bayesian active learning framework for integrated causal discovery and reasoning, which jointly infers a posterior over causal models and queries of interest. ABCI sequentially designs experiments that are maximally informative about the target causal query, collects the corresponding interventional data, and updates its Bayesian beliefs to choose the next experiment. Through simulations, we demonstrate that our approach is more data-efficient than several baselines that focus only on learning the full causal graph. This allows us to accurately learn downstream causal queries from fewer samples, while providing well-calibrated uncertainty estimates for the quantities of interest.
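
To make the active loop described in the abstract concrete, here is a minimal, hypothetical Python sketch. It maintains a Bayesian posterior over just two candidate causal graphs with known linear-Gaussian mechanisms (a deliberate simplification; the paper uses Gaussian-process mechanisms and supports general causal queries), scores candidate experiments by a Monte-Carlo estimate of their expected information gain about the query (here, the graph itself), and updates the posterior after each simulated experiment. All names, priors, and numerical settings below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): Bayesian active causal inference
# over two candidate graphs, X1 -> X2 vs. X2 -> X1, with fixed linear-Gaussian
# mechanisms. The "query" is the causal graph; experiments are chosen greedily
# by expected information gain.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
SLOPE, NOISE = 1.0, 0.5        # assumed mechanism: effect = SLOPE * cause + N(0, NOISE^2)
GRAPHS = ["X1->X2", "X2->X1"]  # hypothesis space: which variable causes the other


def sample(graph, do=None):
    """Draw one sample (x1, x2) from `graph`, optionally under a hard intervention do=(var, value)."""
    order = ("X1", "X2") if graph == "X1->X2" else ("X2", "X1")
    x, parent = {}, None
    for var in order:
        if do is not None and do[0] == var:
            x[var] = do[1]                                       # intervened: clamp to the set value
        elif parent is None:
            x[var] = rng.normal(0.0, 1.0)                        # root (cause) variable
        else:
            x[var] = SLOPE * x[parent] + rng.normal(0.0, NOISE)  # effect given its cause
        parent = var
    return x["X1"], x["X2"]


def log_lik(graph, x1, x2, do=None):
    """Interventional log-likelihood of one observation under `graph` (intervened terms drop out)."""
    cause_name, effect_name = ("X1", "X2") if graph == "X1->X2" else ("X2", "X1")
    cause, effect = (x1, x2) if graph == "X1->X2" else (x2, x1)
    ll = 0.0
    if do is None or do[0] != cause_name:
        ll += norm.logpdf(cause, 0.0, 1.0)
    if do is None or do[0] != effect_name:
        ll += norm.logpdf(effect, SLOPE * cause, NOISE)
    return ll


def update(log_post, x1, x2, do):
    """Bayesian update of the log-posterior over graphs given one experimental outcome."""
    log_post = log_post + np.array([log_lik(g, x1, x2, do) for g in GRAPHS])
    return log_post - np.logaddexp.reduce(log_post)


def entropy(log_post):
    return -np.sum(np.exp(log_post) * log_post)


def expected_info_gain(log_post, do, n_sim=200):
    """Monte-Carlo estimate of the expected reduction in query (graph) entropy for experiment `do`."""
    h_now, gains = entropy(log_post), []
    for _ in range(n_sim):
        g = rng.choice(GRAPHS, p=np.exp(log_post))  # sample a causal model from the posterior
        x1, x2 = sample(g, do)                      # simulate the experiment's outcome under it
        gains.append(h_now - entropy(update(log_post, x1, x2, do)))
    return float(np.mean(gains))


# Active loop: greedily pick the experiment with the highest expected information gain
# about the query, run it on the (here simulated) true SCM, and update the beliefs.
true_graph = "X1->X2"
candidates = [None, ("X1", 2.0), ("X2", 2.0)]       # observational study + two hard interventions
log_post = np.full(len(GRAPHS), -np.log(len(GRAPHS)))
for t in range(10):
    best = max(candidates, key=lambda do: expected_info_gain(log_post, do))
    x1, x2 = sample(true_graph, best)
    log_post = update(log_post, x1, x2, best)
    print(f"round {t}: do={best}, posterior={dict(zip(GRAPHS, np.exp(log_post).round(3)))}")
```

The greedy information-gain criterion above is a standard myopic approximation to Bayesian experimental design; per the abstract, ABCI applies the same sequential idea with the information gain measured with respect to the target causal query rather than the full graph.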
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/active-bayesian-causal-inference/code) (via CatalyzeX)
