Keywords: DAG, Causal Discovery, Structural Learning
Abstract: Recovering the underlying Directed Acyclic Graph (DAG)
structures from observational data presents a formidable challenge, partly due
to the combinatorial nature of the DAG-constrained optimization
problem. Recently, researchers have identified gradient vanishing as
one of the primary obstacles in differentiable DAG learning and have
proposed several DAG constraints to mitigate this issue. By developing
the necessary theory to establish a connection between analytic
functions and DAG constraints, we demonstrate that analytic functions
from the set $\{f(x) = c_0 + \sum_{i=1}^{\infty} c_i x^i \mid \forall i > 0,\, c_i > 0;\ r = \lim_{i\rightarrow\infty} c_i/c_{i+1} > 0\}$ can be employed to
formulate effective DAG constraints. Furthermore, we establish that
this set of functions is closed under several functional operators,
including differentiation, summation, and
multiplication. Consequently, these operators can be leveraged to
create novel DAG constraints based on existing ones. Using these
properties, we design a series of DAG constraints and develop an
efficient algorithm to evaluate them. Experiments
in various settings demonstrate that our DAG constraints
outperform previous state-of-the-art methods. Our implementation is available at https://github.com/zzhang1987/AnalyticDAGLearning.
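To make the abstract's central idea concrete, the sketch below (not the authors' released code) illustrates how one member of the stated function set, the geometric series $f(x) = 1/(1-x)$ with $c_0 = 1$, $c_i = 1$ for $i \ge 1$, and $r = 1 > 0$, yields a DAG constraint of the form $h(W) = \mathrm{tr}(f(W \circ W)) - d\,c_0$. The function name and closed-form evaluation via a matrix inverse are illustrative assumptions; the penalty vanishes exactly when the weighted graph is acyclic, provided the spectral radius of $W \circ W$ stays below the radius of convergence.

```python
# Illustrative sketch of an analytic-function DAG constraint (assumed, not the
# paper's exact algorithm): f(x) = 1/(1 - x) = 1 + x + x^2 + ..., so
# h(W) = tr(f(W∘W)) - d*c_0 = sum_{i>=1} tr((W∘W)^i), which is 0 iff the
# weighted adjacency matrix W encodes a DAG (given rho(W∘W) < r = 1).
import numpy as np


def geometric_dag_constraint(W: np.ndarray):
    """Return h(W) and its gradient for the geometric-series constraint.

    h(W) = tr((I - W∘W)^{-1}) - d, valid when the spectral radius of W∘W < 1.
    """
    d = W.shape[0]
    A = W * W                              # elementwise square keeps entries >= 0
    if np.max(np.abs(np.linalg.eigvals(A))) >= 1.0:
        raise ValueError("spectral radius of W∘W must be below r = 1")
    M = np.linalg.inv(np.eye(d) - A)       # closed form for sum_{i>=0} A^i
    h = np.trace(M) - d                    # subtract d * c_0
    grad = (M @ M).T * 2.0 * W             # chain rule through A = W ∘ W
    return h, grad


if __name__ == "__main__":
    # Acyclic example: a strictly upper-triangular W gives h(W) ≈ 0.
    W_acyclic = np.triu(0.5 * np.ones((3, 3)), k=1)
    print(geometric_dag_constraint(W_acyclic)[0])   # ~0.0
    # Cyclic example: a 2-cycle yields a strictly positive penalty.
    W_cyclic = np.array([[0.0, 0.5], [0.5, 0.0]])
    print(geometric_dag_constraint(W_cyclic)[0])    # > 0
```

In a differentiable structure-learning loop, h(W) and its gradient would be added as a penalty or augmented-Lagrangian term to the data-fitting loss; other members of the set (or functions obtained from them via differentiation, summation, or multiplication, per the abstract) can be substituted by changing the coefficients $c_i$.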
Primary Area: causal reasoning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5464