The best of both worlds: Improved outcome prediction using causal structure learning

27 Sept 2024 (modified: 28 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Outcome prediction, Causal structure learning, Personalised therapy
TL;DR: We learn a shared representation for causal structure learning and outcome prediction, improving the out-of-sample generalisation of outcome prediction while weakening the assumptions required for causal structure learning.
Abstract: In limited-data settings such as the medical domain, causal structure learning can be a powerful tool for understanding the relationships between variables and achieving out-of-sample generalisation when predicting a specific target variable. However, most methods that learn causal structure from observational data rely on strong assumptions, such as the absence of unmeasured confounders, that do not hold in real-world scenarios. In addition, causal relationships between variables change over time as conditions and treatment approaches evolve. Moreover, in a clinical setting, symptoms often need to be managed before the root cause of a problem is found, which places the emphasis on accurate outcome prediction. Consequently, predicting a specific target variable from retrospective observational data on the basis of causal relationships alone is not sufficient for generalisation to prospective data. To overcome these limitations, we opt for the best of both worlds by learning a shared representation for causal structure learning and outcome prediction. We provide extensive empirical evidence showing that this not only facilitates out-of-sample generalisation in outcome prediction but also enhances the robustness of causal discovery for the outcome variable. We also highlight the strengths of our model in terms of time efficiency and interpretability.
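The abstract describes the approach only at a high level. As a rough illustrative sketch (not the authors' architecture), the snippet below shows one way such a coupling could look: an outcome-prediction head and a structure-learning branch share a latent representation shaped by a learnable weighted adjacency matrix, with a NOTEARS-style acyclicity penalty encouraging the adjacency to describe a DAG. Every class name, layer size, and loss weight here is an assumption made for illustration.

```python
# Illustrative sketch only -- NOT the authors' implementation. A learnable weighted
# adjacency matrix A shapes the latent code z used by both the reconstruction
# (structure-learning) branch and the outcome head; a NOTEARS-style acyclicity
# penalty pushes A towards a DAG.
import torch
import torch.nn as nn


class SharedRepresentationModel(nn.Module):  # hypothetical name
    def __init__(self, n_vars: int, hidden_dim: int = 64):
        super().__init__()
        self.A = nn.Parameter(torch.zeros(n_vars, n_vars))  # learned weighted adjacency
        self.encoder = nn.Sequential(
            nn.Linear(n_vars, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, n_vars))
        self.decoder = nn.Sequential(
            nn.Linear(n_vars, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, n_vars))
        self.outcome_head = nn.Sequential(
            nn.Linear(n_vars, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1))

    def forward(self, x: torch.Tensor):
        d = self.A.shape[0]
        eye = torch.eye(d, device=x.device)
        # Graph-structured shared latent (graph-autoencoder style); both heads use z.
        z = self.encoder(x) @ (eye - self.A)
        x_recon = self.decoder(z @ torch.linalg.inv(eye - self.A))  # structure branch
        y_logit = self.outcome_head(z).squeeze(-1)                  # outcome branch
        return y_logit, x_recon

    def acyclicity(self) -> torch.Tensor:
        # NOTEARS constraint h(A) = tr(exp(A * A)) - d; equals zero iff A is acyclic.
        d = self.A.shape[0]
        return torch.trace(torch.linalg.matrix_exp(self.A * self.A)) - d


def joint_loss(model, x, y, lam_recon=1.0, lam_dag=10.0, lam_sparse=0.1):
    """Multi-task objective: outcome loss + reconstruction + acyclicity + sparsity.
    The loss weights are placeholders, not values taken from the paper."""
    y_logit, x_recon = model(x)
    loss = nn.functional.binary_cross_entropy_with_logits(y_logit, y.float())
    loss = loss + lam_recon * nn.functional.mse_loss(x_recon, x)
    loss = loss + lam_dag * model.acyclicity() + lam_sparse * model.A.abs().mean()
    return loss
```

In a multi-task setup of this kind, the relative loss weights control how strongly the learned causal structure constrains the predictive representation; the paper's own trade-off between the two objectives is not specified in the abstract.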
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 10162