Distribution Learning Meets Graph Structure Sampling

Published: 18 Sept 2025, Last Modified: 29 Oct 2025 · NeurIPS 2025 poster · CC BY 4.0
Keywords: Distribution learning, Bayesian networks, Chordal graphs, Online learning
TL;DR: We develop efficient algorithms for non-uniformly sampling over directed acyclic graph structures, and use these, along with results from online learning, to develop efficient algorithms for agnostically learning Bayes nets in KL divergence.
Abstract: This work establishes a novel link between the problem of PAC-learning high-dimensional graphical models and the task of (efficient) counting and sampling of graph structures, using an online learning framework. The problem of efficiently counting and sampling graphical structures, such as spanning trees and acyclic orientations, has been a vibrant area of research in algorithms. We show that this rich algorithmic foundation can be leveraged to develop new algorithms for learning high-dimensional graphical models. We present the first efficient algorithm for (both realizable and agnostic) learning of Bayes nets with a chordal skeleton. In particular, we present an algorithm that, given integers $k,d > 0$, an error parameter $\varepsilon > 0$, an undirected chordal graph $G$ on $n$ vertices, and sample access to a distribution $P^\ast$ on $[k]^n$: (1) returns a Bayes net $\widehat{P}$ with skeleton $G$ and indegree $d$, whose KL divergence from $P^\ast$ is at most $\varepsilon$ more than the optimal KL divergence between $P^\ast$ and any Bayes net with skeleton $G$ and indegree $d$; and (2) uses $\widetilde{O}(n^3k^{d+1}/\varepsilon^2)$ samples from $P^\ast$ and runs in time $\mathrm{poly}(n,k,\varepsilon^{-1})$ for constant $d$. Prior results in this spirit were only for trees ($d=1$, tree skeleton) via Chow-Liu, and, in the realizable setting, for polytrees (arbitrary $d$ but tree skeleton). Thus, our result significantly extends the state of the art in learning Bayes net distributions. We also establish new results for learning tree and polytree distributions.
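For concreteness, the agnostic guarantee in (1) can be restated symbolically; here $\mathcal{BN}_d(G)$ is shorthand (introduced only for this restatement) for the class of Bayes nets with skeleton $G$ and indegree $d$:

$$D_{\mathrm{KL}}\big(P^\ast \,\|\, \widehat{P}\big) \;\le\; \min_{Q \in \mathcal{BN}_d(G)} D_{\mathrm{KL}}\big(P^\ast \,\|\, Q\big) + \varepsilon.$$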
Supplementary Material: zip
Primary Area: General machine learning (supervised, unsupervised, online, active, etc.)
Submission Number: 20438