Keywords: Mixtures of experts, Local differential privacy, PAC-Bayes, Generalization bounds
TL;DR: We present a new approach to mixtures of experts through local differential privacy, motivated by PAC-Bayes theory.
Abstract: We introduce a new approach to the mixture-of-experts model that imposes local differential privacy on the gating mechanism. This approach is theoretically justified by statistical learning theory. Notably, we provide generalization bounds specifically tailored to mixtures of experts, leveraging the one-out-of-$n$ gating mechanism rather than the more common $n$-out-of-$n$ mechanism. Moreover, we show experimentally that our approach improves the generalization ability of mixtures of experts.
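To make the idea of a locally differentially private one-out-of-$n$ gate concrete, here is a minimal sketch assuming $k$-ary randomized response as the LDP mechanism applied to the gate's expert selection. The function names (`ldp_gate`, `mixture_predict`) and the choice of randomized response are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

def ldp_gate(gate_scores, epsilon, rng=None):
    """One-out-of-n gating privatized with k-ary randomized response.

    Picks the expert with the highest gate score, then keeps that choice
    with probability e^eps / (e^eps + n - 1); otherwise it reports a
    uniformly random other expert, which makes the selection
    epsilon-locally differentially private.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(gate_scores)
    chosen = int(np.argmax(gate_scores))
    p_keep = np.exp(epsilon) / (np.exp(epsilon) + n - 1)
    if rng.random() < p_keep:
        return chosen
    others = [i for i in range(n) if i != chosen]
    return int(rng.choice(others))

def mixture_predict(x, experts, gate_scores, epsilon):
    """Route x to a single expert chosen by the privatized gate
    (one-out-of-n routing, as opposed to averaging all n experts)."""
    k = ldp_gate(gate_scores, epsilon)
    return experts[k](x)
```

Smaller values of `epsilon` inject more randomness into the routing decision, which is the knob that trades off fidelity of the gate against the privacy (and, per the abstract, generalization) guarantee.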
Supplementary Material: zip
Primary Area: Learning theory
Submission Number: 10943