Knowledge Intensive Learning of Cutset Networks

Published: 08 May 2023, Last Modified: 26 Jun 2023
Venue: UAI 2023
Readers: Everyone
Keywords: Tractable Probabilistic Models, Cutset Networks, Parameter Learning, Structure Learning
TL;DR: We propose an algorithm to learn cutset networks from sparse and noisy data using qualitative influences
Abstract: Cutset networks (CNs) are interpretable probabilistic representations that combine probability trees and tree Bayesian networks to model and reason about large multi-dimensional probability distributions. Motivated by high-stakes applications in domains such as healthcare, where (a) rich domain knowledge in the form of qualitative influences is readily available and (b) interpretable models that the user can efficiently probe and infer over are often necessary, we focus on learning CNs in the presence of qualitative influences. We propose a penalized objective function that uses the influences as constraints, and we develop a gradient-based learning algorithm, KICN. We show that because CNs are tractable, KICN is guaranteed to converge to a local maximum of the penalized objective function. Our experiments on several benchmark data sets show that our new algorithm is superior to the state-of-the-art, especially when the data is scarce or noisy.
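The abstract describes KICN only at a high level, so the following is a minimal illustrative sketch (not the paper's actual KICN objective or implementation) of the general idea: a positive qualitative influence is encoded as a hinge-style penalty added to the log-likelihood, and the penalized objective is maximized by gradient ascent. The toy data, the function names (`penalized_objective`, `grad`), the hinge form of the penalty, and the penalty weight `lam` are all assumptions made here for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data over two binary variables (x_i, x_j). Suppose domain knowledge
# states a *positive* qualitative influence of x_i on x_j, i.e.
# P(x_j = 1 | x_i = 1) >= P(x_j = 1 | x_i = 0). The noisy sample below
# would violate this constraint if fit by plain maximum likelihood.
data = np.array([[0, 1], [0, 1], [0, 0], [1, 0], [1, 0], [1, 1]])

def penalized_objective(theta, lam):
    """Log-likelihood of P(x_j | x_i) minus a hinge penalty that fires
    only when the learned parameters violate the positive influence."""
    p0 = sigmoid(theta[0])          # P(x_j = 1 | x_i = 0)
    p1 = sigmoid(theta[1])          # P(x_j = 1 | x_i = 1)
    xi, xj = data[:, 0], data[:, 1]
    p = np.where(xi == 1, p1, p0)
    ll = np.sum(xj * np.log(p) + (1 - xj) * np.log(1 - p))
    penalty = max(0.0, p0 - p1)     # > 0 iff the influence is violated
    return ll - lam * penalty

def grad(theta, lam, eps=1e-5):
    """Central-difference gradient (keeps the sketch short and dependency-free)."""
    g = np.zeros_like(theta)
    for k in range(len(theta)):
        e = np.zeros_like(theta)
        e[k] = eps
        g[k] = (penalized_objective(theta + e, lam) -
                penalized_objective(theta - e, lam)) / (2 * eps)
    return g

theta, lam, lr = np.zeros(2), 10.0, 0.1
for _ in range(500):                # simple gradient ascent
    theta += lr * grad(theta, lam)
print("P(x_j=1|x_i=0) =", sigmoid(theta[0]), "P(x_j=1|x_i=1) =", sigmoid(theta[1]))
```

Because the penalty is zero whenever the constraint holds, it leaves the maximum-likelihood solution untouched on data that already respects the influence and only pulls the parameters back toward the constraint when the data is scarce or noisy.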
Supplementary Material: pdf
Other Supplementary Material: zip
