- Keywords: Dirichlet process mixture, variational inference, evidence upper bound, sequential learning
- Abstract: The Dirichlet process (DP) mixture model provides a flexible nonparametric framework for unsupervised learning. Monte Carlo sampling methods for DP mixtures incur heavy computational cost, while conventional variational inference requires careful design of the variational distribution and of the conditional expectations. In this work, we treat the DP mixture itself as the variational proposal and view the given data as samples drawn from the unknown target distribution. We propose an evidence upper bound (EUBO) to serve as a surrogate loss and fit a DP mixture to the data by minimizing the EUBO, which is equivalent to minimizing the KL divergence between the target distribution and the DP mixture. We discuss three advantages of EUBO-based DP mixture fitting and show how to build a black-box sequential learning algorithm. For optimization we use stochastic gradient descent (SGD), which leverages automatic differentiation tools. Simulation studies demonstrate the efficiency of the proposed methods.
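The core idea in the abstract, minimizing the KL divergence between the target and the DP mixture reduces, up to a constant, to minimizing the average negative log density of the observed samples under the mixture, can be illustrated with a small sketch. The snippet below is not the paper's implementation: it fits a truncated stick-breaking mixture of 1-D Gaussians by gradient descent on the sample-average negative log-likelihood, and it uses finite differences as a dependency-free stand-in for the automatic differentiation tools the paper assumes. All names (`stick_breaking`, `neg_log_lik`, the truncation level `K`) are illustrative.

```python
import numpy as np

def stick_breaking(v):
    # Map K-1 unconstrained logits to K mixture weights via truncated
    # stick-breaking: w_k = sigmoid(v_k) * prod_{j<k} (1 - sigmoid(v_j)).
    s = 1.0 / (1.0 + np.exp(-v))
    w = np.empty(len(v) + 1)
    rem = 1.0
    for k, sk in enumerate(s):
        w[k] = sk * rem
        rem *= 1.0 - sk
    w[-1] = rem  # remaining stick mass goes to the last component
    return w

def neg_log_lik(params, x, K):
    # Sample-average negative log density: a Monte Carlo estimate of the
    # KL divergence from target to mixture, up to the target's entropy.
    v, mu, log_sig = params[:K - 1], params[K - 1:2 * K - 1], params[2 * K - 1:]
    w = stick_breaking(v)
    sig = np.exp(log_sig)
    dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
    return -np.mean(np.log(dens.sum(axis=1) + 1e-300))

def grad_fd(f, p, eps=1e-5):
    # Central finite differences; a real implementation would use autodiff.
    g = np.zeros_like(p)
    for i in range(len(p)):
        e = np.zeros_like(p)
        e[i] = eps
        g[i] = (f(p + e) - f(p - e)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
# Synthetic two-mode target; the fitted mixture should cover both modes.
x = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(3, 1.0, 500)])
K = 5  # truncation level of the stick-breaking representation
params = np.concatenate([np.zeros(K - 1), rng.normal(0, 1, K), np.zeros(K)])
f = lambda p: neg_log_lik(p, x, K)
nll0 = f(params)
for step in range(300):
    params -= 0.05 * grad_fd(f, params)  # plain gradient descent
nll_final = f(params)
```

After the loop, `nll_final` should be well below `nll0`, i.e. the fitted mixture assigns higher average log density to the samples than the random initialization did.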