Causal Inference Despite Limited Global Confounding via Mixture Models

Published: 17 Mar 2023, Last Modified: 27 May 2023 (CLeaR 2023 Poster)
Keywords: Mixture models, Bayesian networks, Causal DAGs, Hidden confounder, Population confounder, Global confounding, Causal Identifiability
TL;DR: The first known algorithm for identifying a mixture of Bayesian network distributions, hence recovering the joint distribution with a population/global confounder and making new causal relationships identifiable.
Abstract: A Bayesian Network is a directed acyclic graph (DAG) on a set of $n$ random variables (the vertices); a Bayesian Network Distribution (BND) is a probability distribution on the random variables that is Markovian on the graph. A finite $k$-mixture of such models is graphically represented by a larger graph which has an additional ``hidden'' (or ``latent'') random variable $U$, ranging in $\{1,\ldots,k\}$, and a directed edge from $U$ to every other vertex. Models of this type are fundamental to causal inference, where $U$ models an unobserved confounding effect of multiple populations, obscuring the causal relationships in the observable DAG. By solving the mixture problem and recovering the joint probability distribution with $U$, traditionally unidentifiable causal relationships become identifiable. Using a reduction to the better-studied ``product'' case on empty graphs, we give the first algorithm to learn mixtures of BNDs on non-empty graphs.
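To make the model in the abstract concrete, here is a minimal sketch (not the paper's algorithm) of sampling from a $k$-mixture of Bayesian network distributions: a latent label $U$ selects one of $k$ parameterizations of a fixed DAG and is then dropped from the observed data, acting as a global confounder. The toy DAG, the conditional probability tables, and all variable names are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the paper's algorithm): sampling observations from a
# k-mixture of Bayesian network distributions over binary variables.
# The latent variable U picks which of the k parameterizations generates
# each sample; U is omitted from the returned data, so it acts as a
# global (population) confounder. DAG and CPTs below are toy examples.
import numpy as np

rng = np.random.default_rng(0)

k = 2                                # number of mixture components (range of U)
dag = {0: [], 1: [0], 2: [0, 1]}     # toy DAG: parents of each variable

# One conditional probability table per component and variable:
# cpt[u][v] maps a tuple of parent values to P(X_v = 1 | parents, U = u).
cpt = [
    {0: {(): 0.2},
     1: {(0,): 0.3, (1,): 0.8},
     2: {(0, 0): 0.1, (0, 1): 0.6, (1, 0): 0.4, (1, 1): 0.9}},
    {0: {(): 0.7},
     1: {(0,): 0.9, (1,): 0.2},
     2: {(0, 0): 0.5, (0, 1): 0.1, (1, 0): 0.8, (1, 1): 0.3}},
]
mix_weights = [0.6, 0.4]             # P(U = u)

def sample(n):
    """Draw n observations; U is latent, so only X_0, X_1, X_2 are returned."""
    data = np.zeros((n, len(dag)), dtype=int)
    for i in range(n):
        u = rng.choice(k, p=mix_weights)        # hidden population label
        for v in sorted(dag):                   # topological order of the toy DAG
            parents = tuple(data[i, p] for p in dag[v])
            data[i, v] = rng.random() < cpt[u][v][parents]
    return data

print(sample(5))
```

The identification problem the paper addresses is the reverse direction: given only samples like the output of `sample`, recover the mixture weights and the per-component distributions, and with them the joint distribution including $U$.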