Stochastic Gradient Estimate Variance in Contrastive Divergence and Persistent Contrastive Divergence

ICLR 2014 workshop submission
Decision: submitted, no decision
Abstract: Contrastive Divergence (CD) and Persistent Contrastive Divergence (PCD) are popular methods for training the weights of Restricted Boltzmann Machines. However, both methods rely on approximate sampling from the model distribution. As a side effect, these approximations yield significantly different variances for the stochastic gradient estimates of individual samples. In this paper we show empirically that CD has a lower stochastic gradient estimate variance than exact sampling, while the sum of consecutive PCD estimates has a higher variance than exact sampling. These results offer one explanation for the finding that CD can be used with smaller minibatches or higher learning rates than PCD.
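To make the two estimators concrete, below is a minimal NumPy sketch (not from the paper) of CD-1 and PCD gradient estimates for a small binary RBM with fixed parameters. All function names and the toy variance comparison are illustrative assumptions, not the paper's experimental setup: CD restarts its Gibbs chain at the data point for every estimate, so consecutive estimates are independent, while PCD continues one persistent chain, so consecutive estimates are correlated and their sums have higher variance.

```python
# Toy sketch of CD-1 vs PCD gradient estimates for a small binary RBM.
# Illustrative only; parameters are held fixed (no training loop).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_h(v, W, c):
    """Sample hidden units given visible units; also return p(h=1|v)."""
    p = sigmoid(v @ W + c)
    return (rng.random(p.shape) < p).astype(float), p

def sample_v(h, W, b):
    """Sample visible units given hidden units; also return p(v=1|h)."""
    p = sigmoid(h @ W.T + b)
    return (rng.random(p.shape) < p).astype(float), p

def grad_estimate(v_data, v_model, W, b, c):
    """Stochastic gradient estimate of log-likelihood w.r.t. W:
    positive phase from the data point, negative phase from the
    (approximate) model sample."""
    _, ph_data = sample_h(v_data, W, c)
    _, ph_model = sample_h(v_model, W, c)
    return np.outer(v_data, ph_data) - np.outer(v_model, ph_model)

def gibbs_step(v, W, b, c):
    """One step of block Gibbs sampling: v -> h -> v'."""
    h, _ = sample_h(v, W, c)
    v_new, _ = sample_v(h, W, b)
    return v_new

# Small RBM with arbitrary fixed weights.
n_vis, n_hid = 6, 4
W = 0.1 * rng.standard_normal((n_vis, n_hid))
b = np.zeros(n_vis)
c = np.zeros(n_hid)
v_data = (rng.random(n_vis) < 0.5).astype(float)
v_chain = (rng.random(n_vis) < 0.5).astype(float)  # persistent chain state

cd_grads, pcd_grads = [], []
for _ in range(2000):
    # CD-1: negative sample from a chain restarted at the data point.
    v_neg = gibbs_step(v_data, W, b, c)
    cd_grads.append(grad_estimate(v_data, v_neg, W, b, c))
    # PCD: negative sample from the persistent chain, which carries over.
    v_chain = gibbs_step(v_chain, W, b, c)
    pcd_grads.append(grad_estimate(v_data, v_chain, W, b, c))

cd_arr, pcd_arr = np.array(cd_grads), np.array(pcd_grads)
print("per-step variance, CD-1:", np.var(cd_arr, axis=0).mean())
print("per-step variance, PCD :", np.var(pcd_arr, axis=0).mean())

# Variance of sums of 10 consecutive estimates: PCD's correlated chain
# states inflate the variance of the sum relative to CD's independent ones.
block = 10
cd_sums = cd_arr.reshape(-1, block, n_vis, n_hid).sum(axis=1)
pcd_sums = pcd_arr.reshape(-1, block, n_vis, n_hid).sum(axis=1)
print("variance of 10-step sums, CD-1:", np.var(cd_sums, axis=0).mean())
print("variance of 10-step sums, PCD :", np.var(pcd_sums, axis=0).mean())
```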