Learning Discrete Directed Acyclic Graphs Via Backpropagation

Published: 21 Oct 2022, Last Modified: 21 Apr 2024
Venue: nCSI WS @ NeurIPS 2022 Poster
Keywords: structure, learning, directed, acyclic, graphs
TL;DR: We show that probabilistic backpropagation methods which retain the fully discrete nature of a directed acyclic graph (DAG) can be used to predict DAGs from data.
Abstract: Recently, continuous relaxations have been proposed to learn Directed Acyclic Graphs (DAGs) from data by backpropagation, instead of using combinatorial optimization. However, a number of techniques for fully discrete backpropagation could instead be applied. In this paper, we explore that direction and propose DAG-DB, a framework for learning DAGs by Discrete Backpropagation. Based on the architecture of Implicit Maximum Likelihood Estimation (I-MLE), DAG-DB adopts a probabilistic approach to the problem, sampling binary adjacency matrices from an implicit probability distribution. DAG-DB learns a parameter for the distribution from the loss incurred by each sample, performing competitively using either of two fully discrete backpropagation techniques, namely I-MLE and Straight-Through Estimation.
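To make the sample-then-backpropagate idea concrete, below is a minimal sketch of the Straight-Through route in PyTorch (an assumed framework; the abstract does not fix one). The names `theta` and `sample_loss`, the node count, and the toy loss are hypothetical placeholders, not the authors' DAG-DB implementation; the I-MLE variant would instead use a perturb-and-MAP gradient estimate.

```python
import torch

d = 5  # number of nodes (hypothetical toy size)
theta = torch.zeros(d, d, requires_grad=True)  # edge logits: the distribution's parameter
opt = torch.optim.Adam([theta], lr=0.1)

def sample_loss(adj: torch.Tensor) -> torch.Tensor:
    # Hypothetical stand-in for the loss a sampled graph incurs in DAG-DB;
    # the real framework would score each sampled adjacency matrix against data.
    return adj.sum() + (adj * adj.t()).sum()  # penalise edges and 2-cycles

for step in range(200):
    probs = torch.sigmoid(theta)
    hard = torch.bernoulli(probs.detach())  # fully discrete 0/1 adjacency sample
    # Straight-through trick: the forward pass sees the discrete sample,
    # while gradients flow back through the continuous probabilities.
    adj = hard + probs - probs.detach()
    loss = sample_loss(adj)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The key line is `adj = hard + probs - probs.detach()`: its forward value equals the discrete sample, so the graph stays fully binary, while the backward pass treats the sample as if it were the differentiable edge probabilities.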
Community Implementations: [4 code implementations](https://www.catalyzex.com/paper/arxiv:2210.15353/code)