Keywords: Dirichlet process, Pitman-Yor process, Bayesian nonparametric mixture models, inconsistency
TL;DR: We show that, in contrast to Dirichlet process mixture models, Pitman-Yor mixture models with a prior on the precision parameter but a fixed discount parameter are inconsistent for the number of clusters.
Abstract: Bayesian nonparametric (BNP) mixture models such as Dirichlet process (DP) and Pitman--Yor process (PY) mixture models are popular tools for modeling complex data. Their posterior distributions enjoy desirable theoretical properties, converging to the true data-generating distribution at the optimal minimax rate, and extensive research has been devoted to developing this theory. However, consistency of the posterior distribution does not imply consistency for the number of clusters, and asymptotic guarantees for the posterior number of clusters in these BNP mixture models were lacking until recently, when it was shown that these models can be inconsistent for the number of clusters. For DP mixture models, this inconsistency can be avoided by placing a prior on the model's concentration hyperparameter $\alpha$, as is common practice. In this work, we prove that PY mixture models remain inconsistent for the number of clusters even when a prior is placed on $\alpha$, in the special case where the true number of components in the data-generating mechanism equals 1 and the discount parameter $\sigma \in (0,1)$ is held fixed.