Natural Reweighted Wake-Sleep

Published: 07 Nov 2020, Last Modified: 05 May 2023
NeurIPSW 2020: DL-IG Poster
Keywords: Natural Gradient, Helmholtz Machines, Wake-Sleep Algorithm, Sigmoid Belief Networks, Generative Models
TL;DR: We present a geometric adaptation of the Reweighted Wake-Sleep algorithm, called Natural Reweighted Wake-Sleep, which uses the natural gradient to speed up convergence.
Abstract: The natural gradient has been successfully employed in a wide range of optimization problems. However, for the training of neural networks the resulting increase in computational complexity limits its practical application. Helmholtz Machines are a particular type of generative model, composed of two Sigmoid Belief Networks and commonly trained using the Wake-Sleep algorithm. The locality of the connections in this type of network induces sparsity and a particular structure for the Fisher information matrix that can be exploited for the evaluation of its inverse, allowing the efficient computation of the natural gradient even for large networks. We introduce a novel algorithm called Natural Reweighted Wake-Sleep, a geometric adaptation of Reweighted Wake-Sleep based on the computation of the natural gradient. We present an experimental analysis of the algorithm in terms of speed of convergence and value of the log-likelihood, both with respect to the number of iterations and to training time, demonstrating improvements over non-geometric baselines.
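The computational idea the abstract refers to is that, in a Sigmoid Belief Network, the units of a layer are conditionally independent given their parents, so the Fisher information matrix is block-diagonal with one small block per unit, and each block can be estimated and inverted separately. The sketch below is a minimal, illustrative rendering of that idea for a single layer of Bernoulli units; the function name, parameters, and the damped dense block inversion are assumptions made here for clarity, not the authors' full Natural Reweighted Wake-Sleep procedure (which also relies on the reweighted wake and sleep updates of Reweighted Wake-Sleep).

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def natural_gradient_step(W, b, x, h, lr=0.1, damping=1e-3):
    """One natural-gradient update for a single sigmoid-belief-network layer.

    W: (n_out, n_in) weights; b: (n_out,) biases.
    x: (batch, n_in) parent-layer samples; h: (batch, n_out) sampled child units.
    Because the units are conditionally independent given their parents, the
    Fisher information is block-diagonal, with one (n_in+1) x (n_in+1) block
    per unit, so each block is estimated and inverted independently.
    """
    batch, n_in = x.shape
    n_out = W.shape[0]
    x_aug = np.hstack([x, np.ones((batch, 1))])   # append a bias column
    p = sigmoid(x @ W.T + b)                      # (batch, n_out) activation probabilities

    W_new, b_new = W.copy(), b.copy()
    for i in range(n_out):
        # Euclidean gradient of the log-likelihood for unit i: E[(h_i - p_i) x]
        g = ((h[:, i] - p[:, i])[:, None] * x_aug).mean(axis=0)
        # Monte Carlo estimate of the Fisher block: E[p_i (1 - p_i) x x^T]
        w_var = p[:, i] * (1.0 - p[:, i])
        F = (x_aug * w_var[:, None]).T @ x_aug / batch
        F += damping * np.eye(n_in + 1)           # Tikhonov damping for stability
        nat_g = np.linalg.solve(F, g)             # natural gradient: F^{-1} grad
        W_new[i] += lr * nat_g[:-1]               # gradient ascent on the log-likelihood
        b_new[i] += lr * nat_g[-1]
    return W_new, b_new


if __name__ == "__main__":
    # Toy usage with random binary data, for illustration only.
    rng = np.random.default_rng(0)
    W, b = rng.normal(scale=0.1, size=(5, 8)), np.zeros(5)
    x = rng.integers(0, 2, size=(64, 8)).astype(float)
    h = rng.integers(0, 2, size=(64, 5)).astype(float)
    W, b = natural_gradient_step(W, b, x, h)
```

In practice one would exploit the low-rank structure of the Monte Carlo estimate of each block (e.g., via a Woodbury-style identity) rather than forming and solving dense blocks as above, which is what makes the approach affordable for large networks.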