Keywords: Clustering, Mixture Model, Bayesian Optimization, EM
TL;DR: This paper proposes using Bayesian Optimization instead of EM for maximum likelihood estimation in finite mixtures of elliptical distributions.
Abstract: We address the problem of maximum likelihood estimation (MLE) for finite mixtures of elliptically distributed components, a setting that extends beyond the classical Gaussian mixture model. Standard approaches such as the Expectation–Maximization (EM) algorithm are widely used in practice but are known to suffer from local optima and typically require strong assumptions (e.g., Gaussianity) to guarantee convergence. In this work, we use the Bayesian Optimization (BO) framework for computing the MLE of general elliptical mixture models. We establish that the estimates obtained via BO converge to the true MLE, providing asymptotic *global* convergence guarantees, in contrast to EM. Furthermore, we show that, when the MLE is consistent, the clustering error rate achieved by BO converges to the optimal misclassification rate. Our results demonstrate that BO offers a practical, flexible, and theoretically sound alternative to EM for likelihood-based inference in mixture models, particularly in complex or non-Gaussian elliptical families where EM is difficult to implement or analyze. Experiments on synthetic and real datasets confirm the effectiveness and practical applicability of BO as an alternative to EM.
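To make the core idea concrete, here is a minimal sketch (not the paper's actual algorithm) of using Bayesian Optimization to maximize a mixture log-likelihood directly, rather than running EM: a Gaussian-process surrogate with an expected-improvement acquisition searches over the component means of a toy two-component 1-D Gaussian mixture. All names, the fixed weights/variances, and the synthetic data are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic data: two-component 1-D Gaussian mixture with means -2 and 3
# (equal weights, unit variances) — purely illustrative.
data = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])

def log_likelihood(theta):
    """Mixture log-likelihood as a function of the two unknown means."""
    mu1, mu2 = theta
    dens = 0.5 * norm.pdf(data, mu1, 1) + 0.5 * norm.pdf(data, mu2, 1)
    return np.sum(np.log(dens))

def rbf_kernel(A, B, ls=1.0):
    """Squared-exponential covariance between rows of A and B."""
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / ls**2)

def gp_posterior(X, y, Xq, noise=1e-6):
    """GP posterior mean and std at query points Xq, given observations (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xq)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v**2, axis=0), 1e-12, None)  # prior variance is 1
    return mu, np.sqrt(var)

bounds = np.array([[-6.0, 6.0], [-6.0, 6.0]])
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 2))  # initial design
y = np.array([log_likelihood(t) for t in X])

for _ in range(40):
    # Candidate points; pick the one maximizing expected improvement (EI).
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(512, 2))
    ys = (y - y.mean()) / (y.std() + 1e-9)  # standardize targets for the GP
    mu, sd = gp_posterior(X, ys, cand)
    z = (mu - ys.max()) / sd
    ei = sd * (z * norm.cdf(z) + norm.pdf(z))
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, log_likelihood(x_next))

theta_hat = X[np.argmax(y)]  # BO estimate of the component means
print(theta_hat)
```

Unlike an EM update, no closed-form M-step is needed here, which is the appeal in non-Gaussian elliptical families: the likelihood is treated as a black box, so swapping in a different elliptical density only changes `log_likelihood`.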
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 9852