Demystifying Softmax Gating Function in Gaussian Mixture of Experts

Published: 21 Sept 2023 · Last Modified: 14 Jan 2024 · NeurIPS 2023 spotlight
Keywords: Mixture of Experts, Maximum Likelihood Estimation, Voronoi Loss Function, Algebraic Geometry.
Abstract: Understanding parameter estimation in the softmax gating Gaussian mixture of experts has remained a long-standing open problem in the literature. This is mainly due to three fundamental theoretical challenges associated with the softmax gating function: (i) the identifiability of parameters only up to translation; (ii) the intrinsic interaction, via partial differential equations, between the softmax gating and the expert functions in the Gaussian density; (iii) the complex dependence between the numerator and denominator of the conditional density of the softmax gating Gaussian mixture of experts. We resolve these challenges by proposing novel Voronoi loss functions among parameters and establishing the convergence rates of the maximum likelihood estimator (MLE) for parameter estimation in these models. When the true number of experts is unknown and over-specified, our findings show a connection between the convergence rate of the MLE and the solvability of a system of polynomial equations.
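To make the idea of a Voronoi loss concrete, the sketch below illustrates the general construction: fitted parameter atoms are partitioned into the Voronoi cells of the true atoms (each fitted atom is assigned to its nearest true atom), and within-cell discrepancies are aggregated with the fitted mixture weights. This is a minimal illustrative simplification, not the paper's exact loss; the function name `voronoi_loss` and the specific weighting are assumptions for demonstration only.

```python
import numpy as np

def voronoi_loss(true_atoms, est_atoms, est_weights):
    """Illustrative Voronoi-style loss between parameter sets.

    true_atoms:  (K, d) array of ground-truth parameter atoms.
    est_atoms:   (M, d) array of fitted parameter atoms (M >= K allowed,
                 i.e. the over-specified setting).
    est_weights: (M,) array of fitted mixture weights.
    """
    # Assign each fitted atom to the Voronoi cell of its nearest true atom.
    cells = {j: [] for j in range(len(true_atoms))}
    for i, atom in enumerate(est_atoms):
        j = int(np.argmin([np.linalg.norm(atom - t) for t in true_atoms]))
        cells[j].append(i)

    # Aggregate weighted within-cell distances; a cell fitted by several
    # atoms (over-specification) contributes each atom's weighted distance.
    loss = 0.0
    for j, members in cells.items():
        for i in members:
            loss += est_weights[i] * np.linalg.norm(est_atoms[i] - true_atoms[j])
    return loss
```

In the exact-fitted case the loss decays at the parametric rate, while over-specified cells (several fitted atoms sharing one true atom) are exactly where the paper ties the MLE's slower convergence rate to the solvability of a polynomial equation system.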
Submission Number: 2400