Keywords: Causal inference, Information theory
TL;DR: We quantify the trade-off between the strength of unobserved confounders, as measured by their entropy, and the bounds on the causal effect, and formulate an optimization problem that yields tighter bounds under an entropy constraint.
Abstract: In this paper, we analyze the effect of “weak confounding” on causal estimands. More specifically, under the assumption that the unobserved confounders that render a query non-identifiable have small entropy, we propose an efficient linear program to derive upper and lower bounds on the causal effect. We show that our bounds are consistent in the sense that as the entropy of the unobserved confounders goes to zero, the gap between the upper and lower bound vanishes. Finally, we conduct synthetic and real data simulations to compare our bounds with those obtained by existing methods that cannot incorporate such entropy constraints, and show that our bounds are tighter in settings with weak confounders.
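The abstract's starting point, bounding a non-identifiable causal effect via a linear program over latent response types, can be illustrated with a toy example. The sketch below (not the paper's algorithm; the data and variable names are invented, and the paper's entropy constraint on the confounder is omitted because entropy is non-linear and requires the paper's own treatment) computes the classical no-assumption (Manski-style) bounds on the average treatment effect for binary treatment X and outcome Y by optimizing over joint distributions of potential outcomes consistent with the observed P(X, Y):

```python
# Toy LP bounding the ATE for binary X, Y without instruments or
# entropy constraints. Numbers and names are illustrative only.
from itertools import product

import numpy as np
from scipy.optimize import linprog

# Made-up observed joint distribution P(X=x, Y=y).
p_obs = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

# LP unknowns: q[y0, y1, x] = P(Y(0)=y0, Y(1)=y1, X=x), 8 variables.
idx = list(product([0, 1], repeat=3))

# Equality constraints: consistency with the observed distribution,
# P(X=x, Y=y) = sum of q over response types with Y(x) = y.
A_eq, b_eq = [], []
for (x, y), prob in p_obs.items():
    row = [1.0 if (x_ == x and (y0, y1)[x] == y) else 0.0
           for (y0, y1, x_) in idx]
    A_eq.append(row)
    b_eq.append(prob)

# Objective: ATE = E[Y(1) - Y(0)] as a linear function of q.
c = np.array([y1 - y0 for (y0, y1, _) in idx], dtype=float)

# Minimize and maximize the same linear objective over the feasible set.
lo = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1)).fun
hi = -linprog(-c, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1)).fun
print(f"ATE bounds without entropy constraint: [{lo:.3f}, {hi:.3f}]")
```

With these numbers the LP recovers the bounds [-0.3, 0.7], an interval of width one, which is what makes the query non-identifiable; the paper's contribution is to shrink such intervals by additionally constraining the entropy of the unobserved confounder.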
Submission Number: 32