Learning Discrete Directed Acyclic Graphs via Backpropagation

03 Oct 2022, 16:46 (modified: 10 Nov 2022, 21:17), CML4Impact
Keywords: structure learning, directed acyclic graphs
TL;DR: We show that probabilistic backpropagation methods which retain the fully discrete nature of a directed acyclic graph (DAG) can be used to predict DAGs from data.
Abstract: Recently, continuous relaxations have been proposed for learning Directed Acyclic Graphs (DAGs) from data by backpropagation, instead of using combinatorial optimization. However, a number of techniques for fully discrete backpropagation could be applied instead. In this paper, we explore that direction and propose DAG-DB, a framework for learning DAGs by Discrete Backpropagation. Based on the architecture of Implicit Maximum Likelihood Estimation (I-MLE), DAG-DB adopts a probabilistic approach to the problem, sampling binary adjacency matrices from an implicit probability distribution. DAG-DB learns a parameter of the distribution from the loss incurred by each sample, and performs competitively using either of two fully discrete backpropagation techniques, namely I-MLE and Straight-Through Estimation.
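To illustrate the kind of fully discrete backpropagation the abstract refers to, the following is a minimal sketch of Straight-Through Estimation applied to a Bernoulli-distributed binary adjacency matrix. It is not the paper's DAG-DB implementation: the squared-error loss, the hand-picked target graph, the learning rate, and all variable names are illustrative assumptions. The key idea shown is that the forward pass uses a fully discrete sample, while the backward pass treats the sampling step as the identity, passing the loss gradient straight through to the distribution parameter.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# Hypothetical target DAG: strictly upper-triangular adjacency, hence acyclic.
target = np.triu(np.ones((d, d)), k=1)

theta = np.zeros((d, d))  # logits of the implicit edge distribution (learned)
lr = 0.5                  # illustrative learning rate


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


for step in range(300):
    p = sigmoid(theta)
    # Forward pass: draw a fully discrete binary adjacency matrix.
    z = (rng.random((d, d)) < p).astype(float)
    np.fill_diagonal(z, 0.0)          # disallow self-loops
    # Loss gradient w.r.t. the discrete sample (squared-error loss here).
    grad_z = 2.0 * (z - target)
    # Straight-through: pretend dz/dtheta is the identity and update logits.
    theta -= lr * grad_z

learned = (sigmoid(theta) > 0.5).astype(float)
```

With enough steps, thresholding the learned edge probabilities recovers the target adjacency with high probability: edges that disagree with the target receive a nonzero gradient that pushes their logit in the correcting direction, while agreeing edges receive zero gradient and stay put.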