Expressive power of tensor-network factorizations for probabilistic modeling

Ivan Glasser, Ryan Sweke, Nicola Pancotti, Jens Eisert, Ignacio Cirac

06 Sept 2019 (modified: 05 May 2023) · NeurIPS 2019
Abstract: Tensor-network techniques have recently proven useful as a tool for both the rigorous analysis of existing learning algorithms and the formulation of new methods. Inspired by these developments, and the natural correspondence between tensor networks and probabilistic graphical models, we provide a rigorous analysis of the expressive power of various tensor-train/MPS-based factorizations of discrete multivariate probability distributions. These factorizations, which generalize both hidden Markov models and the probabilistic interpretation of local quantum circuits, exhibit tractable likelihoods and efficient learning algorithms. Interestingly, we prove that there exist unbounded separations between the resource requirements of some of these tensor networks. Additionally, we prove that using complex instead of real tensors can lead to an arbitrarily large reduction in the number of parameters of the network, and that there exists a specific factorization with provably better expressivity than all other representations considered.
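To make the abstract's objects concrete, here is a minimal sketch (not the authors' implementation; see the linked repository for that) of two of the MPS-based factorizations being compared: a nonnegative tensor-train model, which relates to hidden Markov models, and a Born machine, where the probability is the squared modulus of a (possibly complex) MPS amplitude, as in the probabilistic interpretation of local quantum circuits. All names, the cyclic (trace) boundary condition, and the brute-force normalization are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, D = 6, 2, 3  # sequence length, alphabet size, bond dimension

# Tensor-train / MPS cores: one (d, D, D) array of matrices per site.
cores_real = rng.random((n, d, D, D))  # nonnegative cores -> HMM-like model
cores_cplx = rng.standard_normal((n, d, D, D)) + 1j * rng.standard_normal((n, d, D, D))

def mps_contract(cores, x):
    """Trace of the matrix product cores[0][x_0] @ ... @ cores[n-1][x_{n-1}]."""
    M = np.eye(cores.shape[-1], dtype=cores.dtype)
    for i, xi in enumerate(x):
        M = M @ cores[i, xi]
    return np.trace(M)

def positive_mps_prob(cores, x):
    # Nonnegative MPS: contractions are nonnegative, so they are
    # probabilities up to normalization (brute force here, for tiny n only).
    Z = sum(mps_contract(cores, y) for y in np.ndindex(*(d,) * n))
    return mps_contract(cores, x) / Z

def born_machine_prob(cores, x):
    # Born machine: probability is the squared modulus of the MPS amplitude.
    Z = sum(abs(mps_contract(cores, y)) ** 2 for y in np.ndindex(*(d,) * n))
    return abs(mps_contract(cores, x)) ** 2 / Z

x = (0, 1, 1, 0, 1, 0)
print(positive_mps_prob(cores_real, x), born_machine_prob(cores_cplx, x))
```

Both models assign a tractable likelihood to every configuration via a chain of D x D matrix products; the paper's separation results concern how large the bond dimension D must be for each factorization to represent a given distribution.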
CMT Num: 845
Code Link: https://github.com/glivan/tensor_networks_for_probabilistic_modeling