Keywords: Out-of-distribution Detection, Efficiency
TL;DR: We propose a technique to efficiently detect out-of-distribution inputs using the skipping mechanism in dynamic neural networks.
Abstract: Out-of-distribution (OOD) detection is crucial for mitigating adversarial attacks, ensuring model robustness, and maintaining the safety of AI systems. Detecting OOD samples as early as possible is particularly important in the context of large models, as it can minimize the attack surface, provide early warnings, and optimize computational resources. In this work, we focus on detecting OOD inputs in a partial-inference setting and investigate whether the skipping mechanism used in dynamic neural networks (DyNNs) can be leveraged for early OOD detection. We first establish that the feature maps at various DyNN gates can help identify anomalies. Building on this, we propose SkipOOD, a lightweight OOD detector that uses an uncertainty scoring function and an exit detector at each gate to robustly identify OOD samples as early as possible. Through extensive evaluation, we demonstrate that SkipOOD achieves competitive performance in detecting OOD samples while reducing resource usage by nearly 50%.
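To make the gate-wise early-exit idea concrete, the following is a minimal sketch (not the paper's implementation) of how an uncertainty score and an exit decision could be computed at each DyNN gate during partial inference. The class name `SkipOODSketch`, the use of an energy score as the uncertainty function, and the per-gate threshold-based exit detector are all illustrative assumptions; the paper's actual scoring function and exit detector may differ.

```python
import torch
import torch.nn as nn


class SkipOODSketch(nn.Module):
    """Hypothetical sketch of gate-wise early OOD detection in a dynamic network.

    `blocks` are backbone stages; after each block a lightweight gate head maps
    the intermediate feature map to logits. The uncertainty score here is the
    (negative log-sum-exp) energy score, used purely as an illustrative choice.
    Assumes a single input (batch size 1) for simplicity.
    """

    def __init__(self, blocks, gate_heads, exit_thresholds):
        super().__init__()
        self.blocks = nn.ModuleList(blocks)          # DyNN backbone stages
        self.gate_heads = nn.ModuleList(gate_heads)  # per-gate classifier heads
        self.exit_thresholds = exit_thresholds       # per-gate score thresholds

    @staticmethod
    def energy_score(logits, temperature=1.0):
        # Higher energy (less negative) tends to indicate OOD inputs.
        return -temperature * torch.logsumexp(logits / temperature, dim=-1)

    @torch.no_grad()
    def forward(self, x):
        score = None
        for i, (block, head) in enumerate(zip(self.blocks, self.gate_heads)):
            x = block(x)                       # partial inference up to gate i
            logits = head(x)                   # gate-level prediction
            score = self.energy_score(logits)  # uncertainty at this gate
            # Exit detector: flag OOD and stop early if the score looks anomalous,
            # saving the computation of all remaining blocks.
            if bool((score > self.exit_thresholds[i]).all()):
                return {"is_ood": True, "exit_gate": i, "score": score}
        # Input passed every gate: treat as in-distribution after full inference.
        return {"is_ood": False, "exit_gate": len(self.blocks) - 1, "score": score}
```

Exiting at the first gate whose score crosses its threshold is what yields the compute savings: an OOD input never reaches the later, more expensive stages of the network.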
Submission Number: 25