BayesDAG: Gradient-Based Posterior Sampling for Causal Discovery

Published: 19 Jun 2023, Last Modified: 28 Jul 2023 · 1st SPIGM Workshop @ ICML (Oral)
Keywords: Causal Discovery, Structure Learning, Bayesian Inference, Variational Inference, MCMC
TL;DR: We propose a Bayesian causal discovery method that samples DAGs from an augmented space with gradient-based MCMC; it applies to both linear and nonlinear causal models and offers accurate inference.
Abstract: Bayesian causal discovery aims to infer the posterior distribution over causal models from observed data, quantifying epistemic uncertainty and benefiting downstream tasks. However, computational challenges arise from joint inference over the combinatorial space of Directed Acyclic Graphs (DAGs) and nonlinear functions. In this work, we introduce a scalable Bayesian causal discovery framework based on stochastic gradient Markov Chain Monte Carlo (SG-MCMC) that directly samples DAGs from the posterior without any DAG regularization, simultaneously draws function-parameter samples, and is applicable to both linear and nonlinear causal models. To enable our approach, we derive a novel equivalence to permutation-based DAG learning, which opens up the possibility of using any relaxed gradient estimator defined over permutations. To our knowledge, this is the first framework to apply gradient-based MCMC sampling to causal discovery. Empirical evaluations on synthetic and real-world datasets demonstrate our approach's effectiveness compared to state-of-the-art baselines.
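To make the SG-MCMC ingredient concrete, the sketch below runs stochastic gradient Langevin dynamics (SGLD), a standard SG-MCMC sampler, on a toy one-parameter linear-Gaussian model. This is only an illustration of the gradient-based posterior-sampling building block the abstract refers to, not the paper's DAG/permutation sampler; all names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

# Toy SGLD: sample the posterior over the weight w of y = w*x + noise.
# Illustrates the SG-MCMC building block only; NOT the BayesDAG method.

rng = np.random.default_rng(0)

# Synthetic data with true weight 2.0 and noise std 0.5 (variance 0.25).
n = 200
x = rng.normal(size=n)
y = 2.0 * x + 0.5 * rng.normal(size=n)

def grad_log_post(w, xb, yb, scale):
    """Stochastic gradient of log posterior: rescaled minibatch
    log-likelihood gradient plus a standard-normal prior gradient."""
    resid = yb - w * xb
    grad_lik = scale * np.sum(resid * xb) / 0.25  # noise variance 0.25
    grad_prior = -w                                # from N(0, 1) prior
    return grad_lik + grad_prior

eps = 1e-4    # step size
batch = 32
w = 0.0
samples = []
for t in range(3000):
    idx = rng.integers(0, n, size=batch)
    g = grad_log_post(w, x[idx], y[idx], n / batch)
    # SGLD update: half-step gradient ascent plus Gaussian injection noise.
    w += 0.5 * eps * g + np.sqrt(eps) * rng.normal()
    if t > 1000:  # discard burn-in
        samples.append(w)

post_mean = float(np.mean(samples))
print(post_mean)  # posterior mean, close to the true weight 2.0
```

The key property SGLD provides, and that the framework above exploits at scale, is that each update needs only a noisy minibatch gradient of the log posterior, so sampling costs roughly the same per step as stochastic gradient optimization.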
Submission Number: 43