Differentiable Structure Learning with Ancestral Constraints

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: This work extends the differentiable causal discovery framework to integrate ancestral constraints.
Abstract: Differentiable structure learning of causal directed acyclic graphs (DAGs) is an emerging field in causal discovery, leveraging powerful neural learners. However, the incorporation of ancestral constraints, essential for representing abstract prior causal knowledge, remains an open research challenge. This paper addresses this gap by introducing a generalized framework for integrating ancestral constraints. Specifically, we identify two key issues: the non-equivalence of relaxed characterizations of path existence, and order violations among paths during optimization. In response, we propose a binary-masked characterization method and an order-guided optimization strategy tailored to address these challenges. We provide theoretical justification for the correctness of our approach, complemented by experimental evaluations on both synthetic and real-world datasets.
Lay Summary: Understanding cause-and-effect relationships from data, known as causal discovery, is an important goal in many scientific fields. Recent machine learning methods can uncover complex causal patterns but struggle to incorporate the kind of general, high-level knowledge that researchers often have, such as knowing that one variable affects another indirectly. We address this limitation by introducing a new framework that allows such broad, qualitative knowledge to guide the learning process. By integrating information about whether a causal connection (even indirect) should exist or not, our approach makes machine learning-based causal discovery more accurate and aligned with expert understanding.
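To make the notion of an ancestral constraint concrete, here is a minimal sketch, not the paper's method: an ancestral constraint "i must (not) be an ancestor of j" asserts the existence (or absence) of a directed path from i to j, which can be checked via powers of the binarized adjacency matrix. All function names (`reachability`, `ancestral_violations`) are illustrative assumptions.

```python
import numpy as np

def reachability(W, tol=1e-8):
    """Reachability matrix of the graph encoded by weighted adjacency W.
    R[i, j] is True iff there is a directed path (possibly empty) from i to j."""
    d = W.shape[0]
    A = (np.abs(W) > tol).astype(float)               # binarize edge weights
    # (I + A)^(d-1) counts walks of length <= d-1, enough to cover any simple path
    R = np.linalg.matrix_power(np.eye(d) + A, d - 1)
    return R > 0

def ancestral_violations(W, required, forbidden):
    """Count violated ancestral constraints.
    required / forbidden: lists of (i, j) pairs meaning
    'i must be an ancestor of j' / 'i must not be an ancestor of j'."""
    R = reachability(W)
    missing = sum(1 for i, j in required if not R[i, j])
    extra = sum(1 for i, j in forbidden if R[i, j])
    return missing + extra

# Toy chain 0 -> 1 -> 2: node 0 is an (indirect) ancestor of node 2.
W = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])
print(ancestral_violations(W, required=[(0, 2)], forbidden=[(2, 0)]))  # → 0
```

A differentiable method would replace this hard, binarized check with a relaxed (e.g. soft-thresholded) characterization inside the optimization objective; reconciling that relaxation with the exact binary semantics above is precisely the non-equivalence issue the paper addresses.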
Primary Area: General Machine Learning->Causality
Keywords: Causal discovery, Continuous DAG structure learning, Constrained optimization problem
Submission Number: 1110