Breaking the gridlock in Mixture-of-Experts: Consistent and Efficient Algorithms

ICML 2019 (modified: 11 Nov 2022)
Abstract: Mixture-of-Experts (MoE) is a widely popular model for ensemble learning and is a basic building block of highly successful modern neural networks as well as a component in Gated Recurrent Units (G...
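For readers unfamiliar with the model named in the abstract, a minimal sketch of a Mixture-of-Experts forward pass is given below: a softmax gating network produces mixture weights that combine the outputs of the individual experts. This is a generic illustration of the MoE architecture only, not the consistent and efficient algorithms the paper proposes; the class and parameter names (`MoE`, `num_experts`, linear experts) are assumptions of this sketch.

```python
# Minimal Mixture-of-Experts sketch (illustrative only; not the paper's algorithm).
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class MoE:
    """Linear experts combined by a softmax gating network (hypothetical example)."""
    def __init__(self, dim_in, dim_out, num_experts):
        self.W_gate = 0.1 * rng.normal(size=(dim_in, num_experts))          # gating weights
        self.W_experts = 0.1 * rng.normal(size=(num_experts, dim_in, dim_out))  # one weight matrix per expert

    def forward(self, x):
        gates = softmax(x @ self.W_gate)                            # (batch, num_experts)
        expert_out = np.einsum('bd,edo->beo', x, self.W_experts)    # (batch, num_experts, dim_out)
        return np.einsum('be,beo->bo', gates, expert_out)           # gate-weighted combination

moe = MoE(dim_in=4, dim_out=2, num_experts=3)
y = moe.forward(rng.normal(size=(5, 4)))
print(y.shape)  # (5, 2)
```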