A Conditional Independence Test in the Presence of Discretization

09 May 2024 (modified: 06 Nov 2024) · Submitted to NeurIPS 2024 · CC BY 4.0
Keywords: conditional independence test, causal discovery, discretization
TL;DR: We introduce a conditional independence test that corrects misjudgments of conditional independence caused by discretization.
Abstract: Testing conditional independence has many important applications, such as Bayesian network learning and causal discovery. Although several approaches have been developed for learning conditional independence structures over observed variables, these existing methods generally fail when the variables of interest cannot be directly observed and only discretized values of those variables are available. For example, if $X_1$, $\tilde{X}_2$ and $X_3$ are the observed variables, where $\tilde{X}_2$ is a discretization of the latent variable $X_2$, applying existing methods to the observations of $X_1$, $\tilde{X}_2$ and $X_3$ would lead to a false conclusion about the underlying conditional independence of $X_1$, $X_2$ and $X_3$. Motivated by this, we propose a conditional independence test specifically designed to accommodate the presence of discretization. To achieve this, we use a bridge function and nodewise regression to recover the precision coefficients that reflect the conditional dependence of the latent continuous variables under the nonparanormal model. We propose an appropriate test statistic and derive its asymptotic distribution under the null hypothesis of conditional independence. Both theoretical results and empirical validation demonstrate the effectiveness of our test.
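To illustrate the motivating example in the abstract, below is a minimal simulation sketch (not the authors' method; the linear-Gaussian data-generating model, the sign-based discretization, and the Fisher-z partial-correlation test are all illustrative assumptions). It shows how a standard test correctly retains conditional independence when conditioning on the latent $X_2$, but falsely rejects it when conditioning on the observed discretization $\tilde{X}_2$.

```python
# Illustrative sketch only: how discretizing the conditioning variable can
# break a standard partial-correlation CI test. All settings are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 5000

# Ground truth: X1 and X3 are conditionally independent given the latent X2.
X2 = rng.normal(size=n)
X1 = X2 + rng.normal(size=n)
X3 = X2 + rng.normal(size=n)
X2_tilde = (X2 > 0).astype(float)  # observed discretization of X2

def partial_corr_test(x, y, z):
    """Fisher-z test of the partial correlation of x and y given a single z."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)  # residual of x regressed on z
    ry = y - np.polyval(np.polyfit(z, y, 1), z)  # residual of y regressed on z
    r = np.corrcoef(rx, ry)[0, 1]
    zstat = np.sqrt(len(x) - 3) * np.arctanh(r)
    return r, 2 * stats.norm.sf(abs(zstat))

print(partial_corr_test(X1, X3, X2))        # large p-value: CI correctly retained
print(partial_corr_test(X1, X3, X2_tilde))  # tiny p-value: CI falsely rejected
```

In this toy setup, conditioning on $\tilde{X}_2$ leaves residual dependence between $X_1$ and $X_3$ because the discretization discards information about $X_2$, which is exactly the failure mode the proposed test is designed to correct.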
Supplementary Material: zip
Primary Area: Causal inference
Submission Number: 3449