An Alternative Model for Mixtures of Experts

NIPS 1994
Abstract: We propose an alternative model for mixtures of experts which uses a different parametric form for the gating network. The modified model is trained by the EM algorithm. In comparison with earlier models, trained by either EM or gradient ascent, there is no need to select a learning step size. We report simulation experiments which show that the new architecture yields faster convergence. We also apply the new model to two problem domains: piecewise nonlinear function approximation and the combination of multiple previously trained classifiers.
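The abstract's key claim, that the modified gating network admits EM updates with no learning step size, can be illustrated with a small sketch. Below is a minimal NumPy example of one plausible reading of such an architecture: a gating network given by a normalized mixture of Gaussians over the input, combined with linear experts, fit by EM with closed-form M-step updates. The toy data, the linear-expert parameterization, and all variable names are illustrative assumptions, not taken from the paper.

```python
# Sketch: mixture of experts with a normalized-Gaussian gating network,
# fit by EM. Illustrative assumptions throughout; not the paper's exact model.
import numpy as np

rng = np.random.default_rng(0)

# Toy piecewise-linear data with two regimes (hypothetical example).
N, K = 400, 2
x = rng.uniform(-3, 3, size=(N, 1))
y = np.where(x[:, 0] < 0, -2 * x[:, 0] - 1,
             2 * x[:, 0] + 1) + 0.1 * rng.standard_normal(N)

# Gating: mixing weights alpha_j and 1-D Gaussians N(mu_j, var_j) over x.
alpha = np.full(K, 1.0 / K)
mu = np.array([-1.0, 1.0])
var = np.ones(K)

# Experts: linear regressors y = a_j * x + b_j with noise variance s2_j.
a = rng.standard_normal(K)
b = np.zeros(K)
s2 = np.ones(K)

def gauss(x, m, v):
    """Univariate Gaussian density, broadcasting over components."""
    return np.exp(-0.5 * (x - m) ** 2 / v) / np.sqrt(2 * np.pi * v)

for it in range(50):
    # E step: responsibilities h_ij ∝ alpha_j * P(x_i | j) * P(y_i | x_i, j).
    px = gauss(x, mu, var)              # (N, K) input densities
    mean = x * a + b                    # (N, K) expert predictions
    py = gauss(y[:, None], mean, s2)    # (N, K) output likelihoods
    h = alpha * px * py
    h /= h.sum(axis=1, keepdims=True)

    # M step: every update is in closed form, so no step size is needed.
    nk = h.sum(axis=0)
    alpha = nk / N
    mu = (h * x).sum(axis=0) / nk
    var = (h * (x - mu) ** 2).sum(axis=0) / nk
    for j in range(K):
        # Weighted least squares for expert j.
        X = np.hstack([x, np.ones((N, 1))])
        W = h[:, j]
        coef = np.linalg.solve((X * W[:, None]).T @ X,
                               (X * W[:, None]).T @ y)
        a[j], b[j] = coef
        s2[j] = (W * (y - X @ coef) ** 2).sum() / nk[j]
```

Each M-step quantity (mixing weights, gating means and variances, weighted least-squares expert fits) is solved exactly, which is the sense in which no learning step size has to be chosen; a gradient-based fit of a softmax gating network would instead require one. At prediction time the gate weights each expert by its posterior, ŷ(x) = Σ_j g_j(x)(a_j x + b_j) with g_j(x) ∝ alpha_j N(x | mu_j, var_j).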