CausalPFN: Amortized Causal Effect Estimation via In-Context Learning

Published: 09 Jun 2025 · Last Modified: 13 Jul 2025 · ICML 2025 Workshop SIM (Oral) · License: CC BY 4.0
Keywords: Causal Inference, Prior-Fitted Networks, Posterior Predictive Distributions, Ignorability, In-Context Learning, Meta-Learning
TL;DR: CausalPFN is a pre-trained transformer that amortizes causal effect estimation: trained once on simulated data-generating processes, it outputs calibrated effects for new observational datasets with zero tuning.
Abstract: Causal effect estimation from observational data is fundamental across many applications. However, selecting an appropriate estimator from dozens of specialized methods demands substantial manual effort and domain expertise. We present CausalPFN, a single transformer that amortizes this workflow: trained once on a large library of simulated data-generating processes that satisfy ignorability, it infers causal effects for new observational datasets out of the box. CausalPFN combines ideas from Bayesian causal inference with the large-scale training protocol of prior-fitted networks (PFNs), learning to map raw observations directly to causal effects without any task-specific adjustment. Our approach achieves superior average performance on heterogeneous and average treatment effect estimation benchmarks (IHDP, LaLonde, ACIC). This ready-to-use model does not require any further training or tuning and takes a step toward automated causal inference.
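To make the training protocol described in the abstract concrete, here is a minimal sketch of the prior-sampling step: drawing random data-generating processes (DGPs) that satisfy ignorability by construction, and emitting (observational dataset, true effect) pairs of the kind that would supervise a PFN-style transformer. All names (`sample_dgp_task`, `w_prop`, etc.) and the toy linear functional forms are illustrative assumptions, not the paper's actual prior.

```python
# Hypothetical sketch of sampling simulated causal tasks for PFN-style
# pre-training. The functional forms below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sample_dgp_task(n=256, d=5):
    """Sample one synthetic causal task: covariates X, treatment T,
    outcome Y, plus the ground-truth CATE used as the training target."""
    # Random coefficients define this particular task's DGP.
    w_prop = rng.normal(size=d)   # propensity weights (depend only on X)
    w_base = rng.normal(size=d)   # baseline outcome weights
    w_eff = rng.normal(size=d)    # heterogeneous treatment-effect weights

    X = rng.normal(size=(n, d))
    # Treatment depends only on observed covariates X, so ignorability
    # holds by construction in every sampled task.
    propensity = 1.0 / (1.0 + np.exp(-X @ w_prop))
    T = rng.binomial(1, propensity)
    cate = X @ w_eff              # true conditional average treatment effect
    Y = X @ w_base + T * cate + rng.normal(scale=0.1, size=n)
    return X, T, Y, cate

# One training example for the in-context learner: the raw observational
# dataset (X, T, Y) is the context; the true effects are the labels.
X, T, Y, cate = sample_dgp_task()
print(X.shape, T.shape, Y.shape, "true ATE:", cate.mean())
```

At inference time, under this setup, a new observational dataset would simply be fed to the pre-trained transformer as context and effect estimates read off directly, with no fitting or tuning, which is the amortization the abstract describes.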
Submission Number: 32