Structural Learning of Probabilistic Sentential Decision Diagrams under Partial Closed-World Assumption
Keywords: PSDD, structural learning, closed-world assumption
TL;DR: We present a new structural learning algorithm for PSDDs based on a relaxation of the classical closed-world assumption for databases
Abstract: Probabilistic sentential decision diagrams (PSDDs) are a class of structured-decomposable probabilistic circuits specifically designed to embed logical constraints. To adapt the classical LearnSPN scheme to learning the structure of these models, we propose a new scheme based on a partial closed-world assumption: the data implicitly provide the logical base of the circuit. Sum nodes are thus learned by recursively clustering batches of the initial database, while the partitioning of the variables follows a given input vtree. Preliminary experiments show that the proposed approach can properly fit the training data and generalize well to test data, provided that the test data remain consistent with the underlying logical base, which is a relaxation of the training database.
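To make the recursive scheme concrete, the sketch below gives a minimal LearnSPN-style learner in Python under the assumptions stated in the abstract: sum nodes are obtained by clustering rows of the current data batch, product nodes decompose the variables exactly as the given vtree prescribes, and single-variable leaves exclude values never observed in the data (the data-implied logical base). All names (`learn_node`, `Vtree`, the node classes) and the choice of k-means for clustering are illustrative assumptions, not the authors' implementation.

```python
"""Minimal, illustrative sketch of a vtree-guided LearnSPN-style recursion.
Sum nodes cluster data rows; product nodes split variables per the vtree.
This is an assumed reconstruction, not the paper's actual code."""

from dataclasses import dataclass
from typing import List, Tuple, Union

import numpy as np
from sklearn.cluster import KMeans

# A vtree is a binary tree over variable indices (leaf = variable index).
Vtree = Union[int, Tuple["Vtree", "Vtree"]]


@dataclass
class Leaf:                   # univariate distribution over one variable
    var: int
    p_true: float             # P(var = 1), estimated from the data slice


@dataclass
class Product:                # decomposes variables as the vtree dictates
    children: List["Node"]


@dataclass
class Sum:                    # mixture over row clusters (data batches)
    weights: List[float]
    children: List["Node"]


Node = Union[Leaf, Product, Sum]


def vtree_vars(v: Vtree) -> List[int]:
    """Variables covered by a vtree node."""
    if isinstance(v, int):
        return [v]
    left, right = v
    return vtree_vars(left) + vtree_vars(right)


def learn_node(data: np.ndarray, vtree: Vtree,
               min_rows: int = 20, k: int = 2) -> Node:
    """Recursively learn a circuit: cluster rows (sum), split variables (product)."""
    if isinstance(vtree, int):
        # Leaf: values never seen in this batch are excluded, mirroring the
        # idea that the data implicitly provide the (local) logical base.
        col = data[:, vtree]
        seen = set(col.tolist())
        if seen == {0}:
            return Leaf(vtree, 0.0)
        if seen == {1}:
            return Leaf(vtree, 1.0)
        return Leaf(vtree, float(col.mean()))

    # Sum node: cluster the rows restricted to this vtree node's variables.
    cols = vtree_vars(vtree)
    if len(data) >= min_rows:
        labels = KMeans(n_clusters=k, n_init=10).fit_predict(data[:, cols])
        batches = [data[labels == c] for c in range(k) if (labels == c).any()]
        if len(batches) > 1:
            weights = [len(b) / len(data) for b in batches]
            children = [learn_node(b, vtree, min_rows, k) for b in batches]
            return Sum(weights, children)

    # Product node: decompose the variables as the input vtree prescribes.
    left, right = vtree
    return Product([learn_node(data, left, min_rows, k),
                    learn_node(data, right, min_rows, k)])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(200, 4))    # toy binary dataset
    vtree = ((0, 1), (2, 3))                 # a fixed, given vtree
    print(learn_node(X, vtree))
```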
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/structural-learning-of-probabilistic/code)