Bit-Flip Induced Latency Attacks in Object Detection

Published: 01 Jan 2025 · Last Modified: 19 May 2025 · WACV 2025 · License: CC BY-SA 4.0
Abstract: Deep learning and computer vision have experienced significant advancements, particularly in critical applications such as autonomous driving and real-time surveillance, where object detection (OD) plays a pivotal role. Ensuring both the accuracy and speed of these systems is paramount to preventing accidents or failures. Recently, latency-based attacks have emerged as a new threat, driven by the essential need for real-time performance in such applications. These attacks target model responsiveness to disrupt system performance without necessarily compromising accuracy. Our preliminary experiments show that introducing just a few bit flips to key parameters in OD models can significantly increase latency and degrade performance. Meanwhile, recent advances in memory-based attacks such as Row Hammer [18] demonstrate the ability to introduce bit flips at desired locations without physical hardware interaction. Based on these observations, we propose a novel attack on OD models that leverages Row Hammer to introduce bit flips via side channels, targeting the non-maximum suppression (NMS) filter and significantly increasing latency. Unlike previous methods that modify input data, our technique minimizes the number of bit flips by exploiting critical execution paths and remains practical, requiring only a subset of the validation data. Experiments across various datasets and models validate our approach, demonstrating latency increases of up to 71.6 ms (20.4×) with just 31 bit flips.
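The mechanism behind the abstract's claim can be illustrated with a minimal sketch (not the authors' implementation): a single bit flip in an IEEE-754 float parameter, here a hypothetical pre-NMS score threshold, can collapse its value so that nearly every candidate box survives the score filter, inflating the workload handed to NMS and thus end-to-end latency.

```python
# Minimal sketch, assuming a detector whose pre-NMS score threshold is stored
# as a 32-bit float. The parameter name and value are illustrative only.
import struct

def flip_bit(value: float, bit: int) -> float:
    """Flip one bit (0 = LSB, 31 = sign) of a 32-bit IEEE-754 float."""
    (as_int,) = struct.unpack("<I", struct.pack("<f", value))
    return struct.unpack("<f", struct.pack("<I", as_int ^ (1 << bit)))[0]

score_threshold = 0.25                      # typical confidence cutoff before NMS
corrupted = flip_bit(score_threshold, 29)   # clear a high exponent bit

print(score_threshold, "->", corrupted)
# 0.25 -> ~1.36e-20: virtually every box now passes the score filter,
# so NMS must process orders of magnitude more candidates, increasing
# latency without modifying the input image or the detection accuracy path.
```

In a real attack the flip would be induced in DRAM (e.g., via Row Hammer) rather than in Python, and the targeted parameter would be selected by the critical-path analysis described in the paper; this snippet only shows why a single well-placed flip suffices.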
