On Recovering from Modeling Errors Using Testing Bayesian Networks

Published: 25 Jul 2021, Last Modified: 05 May 2023 · TPM 2021
Keywords: Bayesian Networks, State-space Abstraction, Supervised Learning
TL;DR: We address the problem of learning Bayesian Networks with missing edges or states. We propose a remedy for such modeling errors by using dynamic CPT parameters, and show that this remedy can be emulated efficiently using a Testing Bayesian Network.
Abstract: We consider the problem of supervised learning with Bayesian Networks when the dependency structure being used is incomplete due to missing edges or missing variable states. These modeling errors induce independence constraints on the learned model that may not hold in the true, data-generating distribution. We provide a unified treatment of these modeling errors as instances of state-space abstractions. We then identify a class of Bayesian Networks and queries which allow one to fully recover from such modeling errors if one can choose Conditional Probability Tables (CPTs) dynamically based on evidence. We show theoretically that the recently proposed Testing Bayesian Networks (TBNs), which can be trained by compiling them into Testing Arithmetic Circuits (TACs), provide a promising construct for emulating this CPT selection mechanism. Finally, we present empirical results that illustrate the promise of TBNs as a tool for recovering from certain modeling errors in the context of supervised learning.
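As a rough illustration of the dynamic CPT selection that TBNs emulate (a minimal sketch, not code from the paper), a testing node carries two candidate CPTs and a per-parent-instantiation threshold; the CPT row used for inference is chosen by comparing the parents' posterior given evidence against that threshold. All names and array shapes below are hypothetical.

```python
import numpy as np

def select_cpt(parent_posterior, threshold, cpt_pos, cpt_neg):
    """Dynamic CPT selection for a single testing node (sketch).

    For each parent instantiation u, use the row of cpt_pos when
    P(u | evidence) >= threshold[u], and the row of cpt_neg otherwise.

    Hypothetical shapes:
      parent_posterior, threshold -> (num_parent_states,)
      cpt_pos, cpt_neg            -> (num_parent_states, num_node_states)
    """
    use_pos = parent_posterior >= threshold          # boolean mask over parent states
    return np.where(use_pos[:, None], cpt_pos, cpt_neg)

# Toy example: one binary parent, binary node.
posterior = np.array([0.8, 0.2])                     # P(parent = u | evidence)
threshold = np.array([0.5, 0.5])
cpt_pos = np.array([[0.9, 0.1], [0.6, 0.4]])
cpt_neg = np.array([[0.3, 0.7], [0.2, 0.8]])
print(select_cpt(posterior, threshold, cpt_pos, cpt_neg))
# Row 0 comes from cpt_pos (0.8 >= 0.5), row 1 from cpt_neg (0.2 < 0.5).
```

In a TBN, both CPTs and the thresholds are parameters learned during training (e.g., via the compiled TAC), so the selected CPT can adapt to the evidence at hand rather than being fixed in advance.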