Unifying and Understanding Overparameterized Circuit Representations via Low-Rank Tensor Decompositions
Keywords: Probabilistic Circuits, Circuits, Low-Rank Tensor Decomposition
TL;DR: We disentangle the few structural choices that actually determine model performance in terms of expressiveness and computational complexity
Abstract: Tensorizing probabilistic circuits (PCs), structured computational graphs capable of efficiently and accurately performing various probabilistic reasoning tasks, is the standard way to represent and learn these models. This paper systematically explores the architectural options employed in modern overparameterized PCs, namely RAT-SPNs, EiNets, and HCLTs, and unifies them into a single algorithmic framework. By compressing the existing overparameterized layers via low-rank decompositions, we discover alternative parameterizations that possess the same expressive power but are computationally more efficient. This highlights the possibility of “mixing & matching” different design choices to create new PCs and helps to disentangle the few that really matter.
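To illustrate the kind of compression the abstract refers to, here is a minimal NumPy sketch (not the paper's method) of replacing a dense layer parameter matrix with a rank-r factorization via truncated SVD; all shapes and names below are illustrative assumptions.

```python
import numpy as np

# Hypothetical layer sizes, not taken from the paper.
m, n, r = 256, 256, 16  # output units, input units, target rank

rng = np.random.default_rng(0)
W = rng.standard_normal((m, n))  # dense, overparameterized weights

# Truncated SVD yields the best rank-r approximation of W.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :r] * s[:r]  # (m, r) factor, singular values folded in
B = Vt[:r, :]         # (r, n) factor

dense_params = m * n              # 65536
low_rank_params = m * r + r * n   # 8192
print(dense_params, low_rank_params)  # prints: 65536 8192

# A matrix-vector product now costs O((m + n) * r) instead of O(m * n).
x = rng.standard_normal(n)
y_lowrank = A @ (B @ x)
```

The trade-off sketched here mirrors the abstract's point: the factored parameterization is cheaper to store and evaluate, and the question is which such structural choices preserve expressive power.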
Submission Number: 13