Abstract: Tensor-network techniques have recently proven useful as a tool both for the rigorous analysis of existing learning algorithms and for the formulation of new methods. Inspired by these developments, and by the natural correspondence between tensor networks and probabilistic graphical models, we provide a rigorous analysis of the expressive power of various tensor-train/MPS-based factorizations of discrete multivariate probability distributions. These factorizations, which generalize both hidden Markov models and the probabilistic interpretation of local quantum circuits, exhibit tractable likelihoods and admit efficient learning algorithms. Interestingly, we prove that there exist unbounded separations between the resource requirements of some of these tensor networks. Additionally, we prove that using complex rather than real tensors can lead to an arbitrarily large reduction in the number of parameters of the network, and that there exists a specific factorization with provably better expressivity than all of the other representations considered.
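As an informal illustration of the factorizations the abstract refers to, the following sketch (not taken from the linked repository; the helper names `mps_prob` and `born_prob`, and all sizes and values, are illustrative) evaluates a discrete distribution factorized as a non-negative tensor-train/MPS, which generalizes a hidden Markov model, and as a Born machine, which obtains probabilities as squared moduli of a complex-valued MPS amplitude:

```python
# Minimal sketch (assumptions noted above) of two tensor-train factorizations
# of a distribution over n variables with d outcomes each:
#  - a non-negative MPS (generalizing hidden Markov models), and
#  - a Born machine (probabilistic interpretation of a quantum circuit).
import numpy as np

def mps_prob(cores, x):
    """Unnormalized non-negative MPS evaluated at outcome x.

    cores: list of n arrays of shape (D_left, d, D_right), non-negative
           entries, with bond dimension 1 at both boundaries.
    x:     sequence of n integers in range(d).
    """
    v = np.ones(1)
    for core, xi in zip(cores, x):
        v = v @ core[:, xi, :]   # contract one site at a time: O(D^2) per site
    return v.item()              # boundary bonds have dimension 1

def born_prob(cores, x):
    """Unnormalized Born-machine probability |psi(x)|^2 from complex cores."""
    v = np.ones(1, dtype=complex)
    for core, xi in zip(cores, x):
        v = v @ core[:, xi, :]
    return abs(v.item()) ** 2

# Example: n = 4 binary variables, bond dimension D = 3 (all values illustrative).
rng = np.random.default_rng(0)
n, d, D = 4, 2, 3
shapes = [(1, d, D)] + [(D, d, D)] * (n - 2) + [(D, d, 1)]
nonneg = [rng.random(s) for s in shapes]                              # non-negative MPS
cplx = [rng.normal(size=s) + 1j * rng.normal(size=s) for s in shapes] # complex MPS

# Normalize by brute-force summation over all d**n outcomes (fine for this
# tiny example; in practice the normalization is obtained by an efficient
# transfer-matrix contraction of the network).
outcomes = [tuple(int(b) for b in np.base_repr(i, d).zfill(n)) for i in range(d**n)]
Z_mps = sum(mps_prob(nonneg, x) for x in outcomes)
Z_born = sum(born_prob(cplx, x) for x in outcomes)
print(mps_prob(nonneg, (0, 1, 1, 0)) / Z_mps)
print(born_prob(cplx, (0, 1, 1, 0)) / Z_born)
```

Each evaluation costs O(n d D^2), which is what makes likelihoods tractable; the comparison between the real non-negative and the complex parametrization mirrors the abstract's point that complex tensors can represent some distributions with far fewer parameters.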
Code Link: https://github.com/glivan/tensor_networks_for_probabilistic_modeling