Knowledge Intensive Learning of Cutset Networks

Published: 13 Jul 2023, Last Modified: 22 Aug 2023
Venue: TPM 2023
Keywords: Tractable Probabilistic Models, Cutset Networks, Parameter Learning, Structure Learning
TL;DR: We propose an algorithm to learn cutset networks from sparse and noisy data using qualitative influences
Abstract: Cutset networks (CNs) are interpretable probabilistic representations that combine probability trees and tree Bayesian networks to model and reason about large multi-dimensional probability distributions. Motivated by high-stakes applications in domains such as healthcare, where (a) rich domain knowledge in the form of qualitative influences is readily available and (b) interpretable models that the user can efficiently probe and infer over are often necessary, we focus on learning CNs in the presence of qualitative influences. We propose a penalized objective function that uses the influences as constraints, and develop a gradient-based learning algorithm, KICN. We show that because CNs are tractable, KICN is guaranteed to converge to a local maximum of the penalized objective function. Our experiments on several benchmark data sets show that our new algorithm is superior to the state-of-the-art, especially when the data is scarce or noisy.
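To make the idea of "influences as constraints" concrete, the sketch below shows one common way such a penalized objective can be set up; it is purely illustrative and not the paper's KICN algorithm. It assumes a single Bernoulli CPT, a hinge-style penalty for a positive qualitative influence P(Y=1|X=1) >= P(Y=1|X=0), and hypothetical choices for the penalty weight, learning rate, and counts.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def penalized_nll(logits, counts, lam):
    """Negative log-likelihood of a small CPT plus a hinge penalty encoding
    the qualitative influence 'X positively influences Y', i.e.
    P(Y=1 | X=1) >= P(Y=1 | X=0). Illustrative only; not the paper's objective."""
    p0, p1 = sigmoid(logits)             # P(Y=1|X=0), P(Y=1|X=1)
    n00, n01, n10, n11 = counts          # cell counts for (x, y)
    nll = -(n01 * np.log(p0) + n00 * np.log(1 - p0)
            + n11 * np.log(p1) + n10 * np.log(1 - p1))
    violation = max(0.0, p0 - p1)        # how far the influence is violated
    return nll + lam * violation ** 2

def numeric_grad(f, x, eps=1e-6):
    """Central-difference gradient; keeps the sketch dependency-free."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

# Sparse, noisy counts whose raw MLE violates the influence (hypothetical data).
counts = np.array([3.0, 2.0, 4.0, 2.0])   # n(x=0,y=0), n(x=0,y=1), n(x=1,y=0), n(x=1,y=1)
lam = 10.0                                # penalty weight (assumed)
logits = np.zeros(2)
for step in range(500):                   # plain gradient descent on the logits
    logits -= 0.05 * numeric_grad(lambda z: penalized_nll(z, counts, lam), logits)

p0, p1 = sigmoid(logits)
print(f"P(Y=1|X=0) = {p0:.3f}, P(Y=1|X=1) = {p1:.3f}")  # penalty pushes p1 >= p0
```

With these counts the unconstrained maximum-likelihood estimates would give P(Y=1|X=0) = 0.4 and P(Y=1|X=1) ≈ 0.33, violating the stated influence; the penalty term pulls the learned parameters back toward the feasible region, which is the general behavior the abstract describes for scarce or noisy data.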
Submission Number: 17