Keywords: Causal Inference, Transformer, Propensity Score, Self-Supervised Learning
Abstract: We introduce the Propensity Similarity guided Bidirectional Transformer (PSBT), a novel framework designed to estimate causal effects in observational data while addressing confounding bias. PSBT employs a pre-training and fine-tuning approach to learn causal representations, guided by propensity scores. In the pre-training phase, the model predicts masked covariates (self-supervised learning) and propensity similarity between unit pairs (weakly supervised learning), enabling the representation space to disentangle confounding factors. The fine-tuning stage leverages these representations for causal outcome prediction, refining them for counterfactual reasoning.
Experiments on multiple benchmark datasets demonstrate that PSBT significantly outperforms traditional and state-of-the-art causal inference methods in estimating the Conditional Average Treatment Effect (CATE) and on related error metrics. By emphasizing propensity-guided learning over conventional balancing techniques, PSBT achieves robust and interpretable representations, advancing the capabilities of deep learning models in causal effect inference tasks.
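The abstract's two pre-training objectives can be sketched as loss functions: a masked-covariate reconstruction loss (self-supervised) and a pairwise loss that aligns representation similarity with propensity-score similarity (weakly supervised). This is a minimal NumPy illustration, not the authors' implementation; the specific similarity label `1 - |e_i - e_j|` and the cosine-based pair loss are assumptions chosen for clarity.

```python
import numpy as np

def masked_covariate_loss(x, x_hat, mask):
    """MSE over masked covariate entries only (self-supervised objective).
    mask[k] = 1 where covariate k was masked and must be reconstructed."""
    return float(np.sum(mask * (x - x_hat) ** 2) / np.sum(mask))

def propensity_similarity(e_i, e_j):
    """Weak label for a unit pair from propensity scores e(x).
    Hypothetical choice: similarity = 1 - |e_i - e_j|, in [0, 1]."""
    return 1.0 - abs(e_i - e_j)

def pair_similarity_loss(z_i, z_j, target):
    """Push cosine similarity of the two representations toward the
    propensity-based target, encouraging the space to encode confounding."""
    cos = float(np.dot(z_i, z_j)
                / (np.linalg.norm(z_i) * np.linalg.norm(z_j) + 1e-8))
    return (cos - target) ** 2
```

In a full pipeline these two terms would be summed (possibly weighted) over a batch of units and unit pairs to form the pre-training loss, before fine-tuning the encoder for outcome prediction.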
Primary Area: causal reasoning
Submission Number: 7322