JOINT STOCHASTIC APPROXIMATION LEARNING OF HELMHOLTZ MACHINES

ICLR 2016 workshop submission (modified: 18 Feb 2016)
Abstract: Despite progress, learning deep generative models and performing posterior inference in them remain a common challenge, especially when the models contain discrete hidden variables. This paper is concerned with algorithms for learning Helmholtz machines, which are characterized by pairing a generative model with an auxiliary inference model. A common drawback of previous learning algorithms is that they indirectly optimize some bound on the targeted marginal log-likelihood. In contrast, we develop a new class of algorithms, based on stochastic approximation (SA) theory of the Robbins-Monro type, that directly optimizes the marginal log-likelihood while simultaneously minimizing the inclusive KL divergence. The resulting learning algorithm is thus called joint SA (JSA). Moreover, we construct an effective MCMC operator for JSA. Our results on the MNIST dataset demonstrate that JSA's performance is consistently superior to that of competing algorithms such as RWS for learning a range of difficult models.
Conflicts: tsinghua.edu
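
The abstract only describes JSA at a high level (direct SA optimization of the marginal log-likelihood plus the inclusive KL, driven by an MCMC operator that uses the inference model). As a rough illustration of what one such update could look like, below is a minimal sketch for a one-layer Helmholtz machine with Bernoulli units: the inference model proposes latent states, a Metropolis independence step moves the cached latent state toward the posterior, and both models are then updated by stochastic approximation. All parameterizations, names, and hyperparameters here are assumptions made for illustration, not the authors' implementation.

```python
# Minimal JSA-style sketch (assumed setup, not the paper's code):
# one-layer sigmoid Helmholtz machine with Bernoulli visible and hidden units.
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def bernoulli_logprob(bits, probs):
    eps = 1e-7
    p = np.clip(probs, eps, 1 - eps)
    return np.sum(bits * np.log(p) + (1 - bits) * np.log(1 - p))

D, H, lr = 8, 4, 0.05                                   # visible dim, hidden dim, step size
W, c, b = rng.normal(0, 0.1, (D, H)), np.zeros(D), np.zeros(H)  # generative model p(x, h)
V, d = rng.normal(0, 0.1, (H, D)), np.zeros(H)                  # inference model q(h | x)

def log_p_joint(x, h):
    # log p(h) + log p(x | h) under the sigmoid-Bernoulli generative model
    return (bernoulli_logprob(h, sigmoid(b)) +
            bernoulli_logprob(x, sigmoid(W @ h + c)))

def log_q(h, x):
    # log q(h | x) under the sigmoid-Bernoulli inference model
    return bernoulli_logprob(h, sigmoid(V @ x + d))

def jsa_step(x, h_cache):
    global W, c, b, V, d
    # 1) MCMC operator: propose h' from q(h | x) and accept/reject with a
    #    Metropolis independence step, so h_cache drifts toward p(h | x).
    h_prop = (rng.random(H) < sigmoid(V @ x + d)).astype(float)
    log_ratio = (log_p_joint(x, h_prop) - log_p_joint(x, h_cache)
                 + log_q(h_cache, x) - log_q(h_prop, x))
    if np.log(rng.random()) < log_ratio:
        h_cache = h_prop
    # 2) SA update of the generative model: ascend grad log p(x, h_cache),
    #    a stochastic step on the marginal log-likelihood.
    a = sigmoid(W @ h_cache + c)
    W += lr * np.outer(x - a, h_cache)
    c += lr * (x - a)
    b += lr * (h_cache - sigmoid(b))
    # 3) SA update of the inference model: ascend grad log q(h_cache | x),
    #    which stochastically minimizes the inclusive KL divergence.
    u = sigmoid(V @ x + d)
    V += lr * np.outer(h_cache - u, x)
    d += lr * (h_cache - u)
    return h_cache

# Toy usage: one data point with a persistent Markov-chain state for its latents.
x = rng.integers(0, 2, D).astype(float)
h = rng.integers(0, 2, H).astype(float)
for _ in range(100):
    h = jsa_step(x, h)
```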
