Gradient-Based Causal Discovery with Diffusion Models

24 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: Causal discovery, generative models
TL;DR: The paper proposes to use diffusion models for causal discovery, searching for the DAG under a continuous optimization framework.
Abstract: Causal discovery from observational data is an important problem in many applied sciences. Incorporating a recently proposed smooth characterization of acyclicity, gradient-based causal discovery approaches search for a Directed Acyclic Graph (DAG) by optimizing various neural models. Although they show inspiring results when certain assumptions are satisfied, their capability to model complex nonlinear causal generative functions remains unsatisfactory. Motivated by recent advances in deep generative models, we propose to use diffusion models for causal discovery and to search for the DAG under continuous optimization frameworks. The underlying nonlinear causal generative process is modeled with a diffusion process; with flexible parameter configurations, it can represent a wide range of functions, and the proposed causal discovery approach generates graphs with satisfactory accuracy on observational data produced by either linear or nonlinear causal models. This is evidenced by empirical results on both synthetic and real data.
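The "smooth characterization of acyclicity" the abstract refers to is, in the continuous-optimization causal discovery literature, commonly the NOTEARS-style constraint h(W) = tr(e^{W∘W}) − d, which is zero exactly when the weighted adjacency matrix W encodes a DAG. The paper itself does not spell out its constraint here, so the following is a hedged sketch of that standard formulation, not the authors' exact implementation (the function name `acyclicity` and the series-based matrix exponential are illustrative choices):

```python
import numpy as np

def acyclicity(W: np.ndarray, terms: int = 25) -> float:
    """NOTEARS-style smooth acyclicity measure h(W) = tr(exp(W ∘ W)) - d.

    h(W) == 0 if and only if W is the weighted adjacency matrix of a DAG;
    h(W) > 0 when W contains a directed cycle. Differentiable in W, so it
    can be used as an equality constraint in gradient-based DAG search.
    """
    d = W.shape[0]
    A = W * W              # elementwise square: removes edge-weight signs
    E = np.eye(d)          # running sum of the matrix-exponential series
    P = np.eye(d)          # current term A^k / k!
    for k in range(1, terms):
        P = P @ A / k
        E = E + P
    return float(np.trace(E) - d)

# Edge 0 -> 1 only (a DAG) gives h == 0; a 2-cycle gives h > 0.
dag = np.array([[0.0, 1.0], [0.0, 0.0]])
cyc = np.array([[0.0, 1.0], [1.0, 0.0]])
print(acyclicity(dag))
print(acyclicity(cyc))
```

In a gradient-based framework, this penalty is typically driven to zero (e.g. via an augmented Lagrangian) while a neural model, here a diffusion model, is fit to the observational data.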
Primary Area: causal reasoning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3692