Abstract: The real world naturally has dimensions of time and space. Therefore, estimating counterfactual outcomes with spatial-temporal attributes is a crucial problem. However, previous methods rely on classical statistical models, which remain limited in performance and generalization. This paper proposes a novel framework for estimating counterfactual outcomes with spatial-temporal attributes using the Transformer, exhibiting stronger estimation ability. Under mild assumptions, the proposed estimator within this framework is consistent and asymptotically normal. To validate the effectiveness of our approach, we conduct both simulation and real-data experiments. Simulation experiments show that our estimator has stronger estimation capability than baseline methods. Real-data experiments yield valuable conclusions about the causal effect of conflicts on forest loss in Colombia. The source code is available at this [URL](https://github.com/lihe-maxsize/DeppSTCI_Release_Version-master).
Lay Summary: Understanding what would have happened in a different situation — known as estimating counterfactual outcomes — is key to answering many real-world questions, such as how a policy might affect the environment. But when these events unfold across both space and time, existing methods struggle to give accurate answers. In this work, we propose a new approach to estimating counterfactual outcomes that vary across both space and time, using Transformer models — a powerful and general deep learning architecture known for capturing complex patterns. We evaluate our method on both simulated data and real-world events in Colombia, where we study how violent conflicts affect forest loss. Our model outperforms existing methods and offers new insights into the environmental consequences of conflict.
Link To Code: https://github.com/lihe-maxsize/DeppSTCI_Release_Version-master
Primary Area: General Machine Learning->Causality
Keywords: Counterfactual Outcomes Estimation, Potential Outcome Framework, Spatial-Temporal Data
Submission Number: 6300