Deep Directed Generative Models with Energy-Based Probability Estimation
Taesup Kim, Yoshua Bengio
Feb 18, 2016 (modified: Feb 18, 2016) · ICLR 2016 workshop submission · Readers: everyone
Abstract: Energy-based probabilistic models face intractable computations during learning, which require appropriate samples drawn from the estimated probability distribution. Such samples can be obtained approximately by Markov chain Monte Carlo (MCMC), but the chains mix poorly, especially with deep models, which slows learning. We introduce an auxiliary deep model that deterministically generates samples from the estimated distribution, making learning easier without any costly sampling process. We thus propose a new framework that trains an energy-based probabilistic model with two separate deep feed-forward models: one estimates the energy function, and the other deterministically generates samples from it. Consequently, we can estimate both the probability distribution and its corresponding deterministic generator with deep models.
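The two-network idea the abstract describes can be illustrated with a toy sketch (this is not the paper's implementation): a one-parameter energy model E(x; mu) = (x - mu)^2 is pushed down on data and up on generated samples, while a deterministic generator G(z) = a*z + b is trained to produce low-energy samples in place of MCMC sampling. All parameter names and the 1-D Gaussian data are illustrative assumptions.

```python
import numpy as np

# Toy sketch of the framework in the abstract (illustrative, not the
# paper's method): an energy model and a deterministic generator are
# trained alternately, so no MCMC sampling is needed.

rng = np.random.default_rng(0)
mu = 0.0          # energy model parameter: E(x) = (x - mu)^2
a, b = 1.0, 0.0   # generator parameters: G(z) = a*z + b
lr = 0.02
data_mean = 3.0   # assumed toy data distribution N(3, 0.5^2)

for step in range(2000):
    x = data_mean + 0.5 * rng.standard_normal(64)  # real data batch
    z = rng.standard_normal(64)                    # generator noise
    g = a * z + b                                  # generated samples

    # Energy update: lower the energy of data, raise it on generated
    # samples (a contrastive objective; gradients are analytic here).
    grad_mu = -2.0 * np.mean(x - mu) + 2.0 * np.mean(g - mu)
    mu -= lr * grad_mu

    # Generator update: move its samples toward low-energy regions.
    g = a * z + b
    grad_a = 2.0 * np.mean(z * (g - mu))
    grad_b = 2.0 * np.mean(g - mu)
    a -= lr * grad_a
    b -= lr * grad_b

# Both the energy minimum mu and the generator offset b should
# approach the data mean (~3.0).
print(round(mu, 2), round(b, 2))
```

In the full framework both networks are deep feed-forward models rather than scalar parameters, but the alternating push-down/push-up structure is the same.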