Boosting Generative Models by Leveraging Cascaded Meta-Models

Sep 25, 2019 · ICLR 2020 Conference Withdrawn Submission · Readers: everyone
  • Keywords: Probabilistic Machine Learning, Learning Generative Models, Unsupervised Learning
  • TL;DR: We propose an approach for boosting generative models by cascading hidden-variable meta-models.
  • Abstract: Deep generative models are a powerful framework for learning data distributions and have achieved tremendous success in numerous scenarios. However, it is nontrivial for a single generative model to faithfully capture the distribution of complex data, such as images with complicated structures. In this paper, we propose cascaded boosting, a novel approach for boosting generative models in which meta-models (i.e., weak learners) are cascaded together to produce a stronger model. Any hidden-variable meta-model can be leveraged as long as it supports likelihood evaluation. We derive a decomposable variational lower bound of the boosted model, which allows each meta-model to be trained separately and greedily. The learning power of the generative models can be further improved by combining our cascaded boosting framework with the multiplicative boosting framework. (An illustrative training sketch follows the code link below.)
  • Code: https://github.com/c-bgm/cascaded_boosting
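The cascaded training idea described in the abstract can be illustrated with a short, hedged sketch: each meta-model is treated as a small VAE, and each stage is fit greedily to the latent codes produced by the previous stage. The `MetaVAE` and `train_cascade` names, the choice of VAEs as meta-models, and all dimensions are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
# A minimal, illustrative sketch (not the authors' code) of greedy cascaded
# training: meta-model k is fit to the latent codes produced by meta-model k-1.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaVAE(nn.Module):
    """A small Gaussian VAE used as one meta-model (weak learner). Assumed for illustration."""
    def __init__(self, x_dim, z_dim, hidden=128):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, x_dim))

    def encode(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        return mu, logvar

    def elbo(self, x):
        mu, logvar = self.encode(x)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()        # reparameterize
        recon = self.dec(z)
        rec_ll = -F.mse_loss(recon, x, reduction='none').sum(-1)    # Gaussian recon term (up to a constant)
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)  # KL(q(z|x) || N(0, I))
        return (rec_ll - kl).mean()


def train_cascade(data, dims, epochs=10, lr=1e-3):
    """Greedily train a cascade: each stage's latent codes become the next stage's data."""
    models, inputs = [], data
    for z_dim in dims:
        m = MetaVAE(inputs.shape[1], z_dim)
        opt = torch.optim.Adam(m.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            loss = -m.elbo(inputs)        # maximize this stage's ELBO in isolation
            loss.backward()
            opt.step()
        models.append(m)
        with torch.no_grad():             # posterior means feed the next stage
            inputs = m.encode(inputs)[0]
    return models

if __name__ == "__main__":
    x = torch.randn(256, 32)              # toy data stands in for images
    cascade = train_cascade(x, dims=[16, 8])
    print(f"trained {len(cascade)} cascaded meta-models")
```

Passing posterior means to the next stage is one simple choice made here for clarity; sampling latent codes instead would be an equally plausible variant of the same cascaded, stage-wise training loop.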