Track: Full Paper
Abstract: Binarized Neural Networks (BNNs) have emerged as a sensible quantization method for reducing compute costs at inference time. As with other machine learning systems deployed in practice, they are susceptible to side-channel attacks that can be leveraged to reveal their internal characteristics and architecture. Previous work on side channels in BNNs has been limited to the physical domain, requiring a powerful adversary with granular access to the system and advanced hardware tools. In this paper, we show how the inherently binary weight distribution of BNNs makes them susceptible to timing attacks under a practical, software-based threat model. We achieve this by leveraging abnormal timing differences in operations on subnormal floating-point numbers. Our contributions are twofold: (a) we show how carefully crafted inputs can produce a timing signal strong enough to reveal all the weights of an individual neuron, and (b) we scale the attack to infer a fraction of the input layer of a BNN. We conclude by assessing the challenges of BNN implementations, in the hope that our findings will motivate safer deployments of BNNs.
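The abstract's key mechanism is that arithmetic on subnormal (denormal) floating-point values can take measurably longer than on normal values on common hardware. The sketch below, which is not from the paper, illustrates the idea under the assumption of IEEE 754 binary64 floats; the constants and the `time_mults` helper are illustrative, and in interpreted Python the interpreter overhead may mask the hardware-level slowdown that a compiled implementation would observe.

```python
import sys
import time

# Assumption: IEEE 754 binary64 floats, as on typical x86-64 builds.
SMALLEST_NORMAL = sys.float_info.min        # ~2.2e-308, smallest positive normal
SUBNORMAL = SMALLEST_NORMAL / 2.0**10       # well inside the subnormal range

def time_mults(x, n=200_000):
    """Time n dependent multiply/divide round-trips starting from x.

    Multiplying by a factor slightly below 1 keeps a subnormal input
    subnormal; the matching divide undoes the drift so the value never
    underflows to zero or climbs back into the normal range.
    """
    start = time.perf_counter()
    acc = x
    for _ in range(n):
        acc = acc * 0.999
        acc = acc / 0.999
    return time.perf_counter() - start, acc

t_normal, _ = time_mults(1.0)        # operands stay normal
t_subnormal, _ = time_mults(SUBNORMAL)  # operands stay subnormal
print(f"normal operands:    {t_normal:.4f}s")
print(f"subnormal operands: {t_subnormal:.4f}s")
```

In a BNN, whether a crafted subnormal input is multiplied by +1 or -1 (or zeroed out) determines whether subsequent operations stay in the slow subnormal range, which is what turns the binary weights into a timing signal.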
Submission Number: 32