Matrix Product Operator Restricted Boltzmann Machines

19 Oct 2018 (modified: 05 May 2023) · NIPS 2018 Workshop CDNNRIA Blind Submission · Readers: Everyone
Abstract: A restricted Boltzmann machine (RBM) learns a probability distribution over its input samples and has numerous uses, such as dimensionality reduction, classification, and generative modeling. Conventional RBMs accept only vectorized data, which discards potentially important structural information in the original tensor (multi-way) input. Matrix-variate and tensor-variate RBMs, named MvRBM and TvRBM, have been proposed, but both are restrictive by construction. This work presents the matrix product operator RBM (MPORBM), a tensor network generalization of MvRBM/TvRBM that preserves the input format in both the visible and hidden layers and attains higher expressive power. A novel training algorithm integrating contrastive divergence with an alternating optimization procedure is also developed.
TL;DR: Propose a general tensor-based RBM model that greatly compresses the model while retaining strong expressive capacity.
Keywords: restricted Boltzmann machines, generative model, model compression
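To make the architecture concrete, the following is a minimal numpy sketch of the idea described in the abstract: matrix-shaped visible and hidden layers coupled by a weight tensor stored as a two-core matrix product operator (MPO), with Gibbs sampling as used in contrastive divergence. The specific shapes, names (G1, G2, B, C), two-core parameterization, and binary-unit conditionals are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

# Hypothetical MPORBM sketch: visible layer V is an I1 x I2 matrix, hidden
# layer H is a J1 x J2 matrix, and the 4-way weight tensor W[i1, j1, i2, j2]
# is stored as two MPO cores with bond dimension R, so the parameter count
# scales with R rather than I1*I2*J1*J2 (the compression mentioned in the TL;DR).
rng = np.random.default_rng(0)
I1, I2, J1, J2, R = 4, 5, 3, 3, 2

G1 = 0.1 * rng.standard_normal((I1, J1, R))   # MPO core G1[i1, j1, r]
G2 = 0.1 * rng.standard_normal((R, I2, J2))   # MPO core G2[r, i2, j2]
B = np.zeros((I1, I2))                        # visible bias matrix
C = np.zeros((J1, J2))                        # hidden bias matrix

def full_weight():
    # Contract the two MPO cores into the full tensor W[i1, j1, i2, j2].
    return np.einsum('ajr,rbk->ajbk', G1, G2)

def energy(V, H):
    # E(V, H) = -<V, B> - <H, C> - sum V[i1,i2] W[i1,j1,i2,j2] H[j1,j2]
    W = full_weight()
    interaction = np.einsum('ab,ajbk,jk->', V, W, H)
    return -np.sum(V * B) - np.sum(H * C) - interaction

def sample_hidden(V):
    # P(H[j1,j2] = 1 | V) = sigmoid(C + sum_{i1,i2} V[i1,i2] W[i1,j1,i2,j2])
    act = C + np.einsum('ab,ajbk->jk', V, full_weight())
    p = 1.0 / (1.0 + np.exp(-act))
    return p, (rng.random(p.shape) < p).astype(float)

def sample_visible(H):
    # P(V[i1,i2] = 1 | H) = sigmoid(B + sum_{j1,j2} W[i1,j1,i2,j2] H[j1,j2])
    act = B + np.einsum('ajbk,jk->ab', full_weight(), H)
    p = 1.0 / (1.0 + np.exp(-act))
    return p, (rng.random(p.shape) < p).astype(float)

# One CD-1 style Gibbs step on a random binary "image" (illustration only).
V0 = (rng.random((I1, I2)) < 0.5).astype(float)
_, H0 = sample_hidden(V0)
_, V1 = sample_visible(H0)
print('energy before/after one Gibbs step:', energy(V0, H0), energy(V1, H0))
```

In a training loop, the positive and negative phase statistics from such Gibbs steps would be used to update the MPO cores one at a time, in the spirit of the contrastive-divergence-plus-alternating-optimization procedure the abstract describes; the details above are only a sketch of the data flow, not the authors' algorithm.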