Keywords: MoE, Buffer Overflow, Attack
Abstract: Mixture of Experts (MoE) has become a key ingredient for scaling large foundation models while keeping inference costs steady.
We show that expert routing strategies with cross-batch dependencies are vulnerable to attack: queries sent by a malicious user can affect the model's outputs on benign queries that are grouped into the same batch.
We demonstrate this via a \emph{proof-of-concept} attack in a \emph{toy experimental setting}.
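The cross-batch dependency arises because experts typically have a fixed per-batch buffer capacity, so whether a benign token keeps its preferred expert can depend on which other tokens share the batch. The following minimal sketch (not the paper's implementation; expert count, capacity, and routing order are illustrative assumptions) shows how attacker tokens placed in the same batch can exhaust an expert's buffer and cause a victim token to be dropped:

```python
# Minimal sketch (illustrative assumptions, not the paper's setup):
# capacity-limited top-1 routing processed in batch order.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 4
CAPACITY = 2  # per-expert buffer slots for the whole batch (assumed)

def route(batch_logits):
    """Assign each token to its top-1 expert until that expert's buffer fills.

    batch_logits: (num_tokens, NUM_EXPERTS) router scores.
    Returns expert ids, or None for tokens dropped due to buffer overflow.
    """
    load = [0] * NUM_EXPERTS
    assignment = []
    for logits in batch_logits:
        expert = int(np.argmax(logits))
        if load[expert] < CAPACITY:
            load[expert] += 1
            assignment.append(expert)
        else:
            # Token dropped (or rerouted): its fate depends on the
            # other tokens that happen to share the batch.
            assignment.append(None)
    return assignment

# Victim token whose router score strongly prefers expert 0.
victim = rng.normal(size=(1, NUM_EXPERTS))
victim[0, 0] += 5.0

# Benign batch: the victim token reaches its preferred expert.
print(route(victim))                         # [0]

# Adversarial batch: attacker tokens placed earlier in the batch also
# target expert 0, exhausting its buffer, so the victim token is dropped.
attacker = np.tile(victim, (CAPACITY, 1))
print(route(np.vstack([attacker, victim])))  # [0, 0, None]
```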
Submission Number: 208