Bounding the Robustness and Generalization for Individual Treatment Effect

24 Sept 2023 (modified: 02 Feb 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Individual Treatment Effect, Causal inference
Abstract: Individual treatment effect (ITE) estimation has important applications in fields such as healthcare, economics, and education, and has therefore attracted increasing attention from both the research and industrial communities. However, most existing models may not perform well in practice because ITE estimates produced by deep neural networks lack robustness when an imperceptible perturbation is added to the covariates. To alleviate this problem, we first derive an informative generalization bound showing that the expected ITE estimation error is controlled by, among other terms, the Lipschitz constant of the ITE model. In addition, to measure distances between distributions with Integral Probability Metrics (IPM), we obtain explicit bounds for the Wasserstein (WASS) and Maximum Mean Discrepancy (MMD) distances. Building on these results, we propose two types of regularization, Lipschitz Regularization and reproducing kernel Hilbert space (RKHS) Regularization, that encourage robustness when estimating ITE from observational data. Extensive experiments on both synthetic examples and standard benchmarks demonstrate our framework's effectiveness and generality. To benefit this research direction, we release our project at https://github-rite.github.io/rite/.
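The paper's own implementation is not reproduced on this page, but the MMD term it mentions, used as an IPM between the treated and control covariate (or representation) distributions, can be sketched as follows. This is a minimal illustration with an RBF kernel; the function names and the bandwidth choice are assumptions, not taken from the paper:

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    # Pairwise RBF (Gaussian) kernel matrix between rows of x and rows of y.
    sq_dists = (np.sum(x**2, axis=1)[:, None]
                + np.sum(y**2, axis=1)[None, :]
                - 2.0 * x @ y.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd2(x, y, sigma=1.0):
    # Biased estimate of the squared MMD between samples x and y.
    # In an ITE setting, x and y would be representations of treated
    # and control units; this term is added to the factual loss as a
    # distributional-balance regularizer.
    return (rbf_kernel(x, x, sigma).mean()
            + rbf_kernel(y, y, sigma).mean()
            - 2.0 * rbf_kernel(x, y, sigma).mean())
```

The MMD vanishes when the two samples coincide and grows as the treated and control distributions separate, which is what makes it usable as a penalty encouraging balanced representations.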
Primary Area: causal reasoning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8972