NODEAttack: Adversarial Attack on the Energy Consumption of Neural ODEs

Published: 28 Jan 2022, Last Modified: 13 Feb 2023. ICLR 2022 Submission.
Keywords: Adversarial Machine Learning, Energy Consumption
Abstract: Recently, Neural ODE (Ordinary Differential Equation) models have been proposed, which solve an ordinary differential equation to compute the output of a neural network. Due to their low memory usage, Neural ODE models are an attractive alternative for deployment on resource-constrained devices (e.g., IoT devices, mobile devices). However, deploying a deep learning model on a resource-constrained device requires a low inference-time energy cost as well as a low memory cost. Unlike the memory cost, the energy consumption of a Neural ODE during inference can vary because of the adaptive nature of ODE solvers. Attackers can exploit this adaptive behaviour to attack the energy consumption of Neural ODEs; however, such energy-based attack scenarios have not been explored against Neural ODEs. To expose the vulnerability of Neural ODEs to adversarial energy-based attacks, we propose NODEAttack. The objective of NODEAttack is to generate adversarial inputs that require more ODE solver computations, thereby increasing the inference-time energy consumption of Neural ODEs. Our extensive evaluation on two datasets and two popular ODE solvers shows that samples generated by NODEAttack consume up to 168% more energy during inference than the average energy consumption of benign test data. Our evaluation also shows that the attack transfers across solvers and architectures. Finally, we present a case study on the impact of the generated adversarial examples, showing that they can decrease the efficiency of an object-recognition-based mobile application by 50%.
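The core idea, that inputs can be crafted to force an adaptive solver to perform more computations, can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example, not the paper's actual attack: it measures the number of function evaluations (NFE) of a Neural ODE under the adaptive dopri5 solver from torchdiffeq, then runs a simple random-search perturbation loop that keeps whichever perturbed input maximizes NFE. The model, the ODEFunc dynamics, the perturbation budget, and the search procedure are all illustrative assumptions.

```python
# Minimal illustrative sketch (NOT the paper's attack): measure and maximize
# the number of function evaluations (NFE) an adaptive ODE solver performs,
# which serves as a proxy for inference-time energy cost.
import torch
import torch.nn as nn
from torchdiffeq import odeint  # supports adaptive solvers such as 'dopri5'


class ODEFunc(nn.Module):
    """Dynamics f(t, y) of the Neural ODE; counts solver calls in self.nfe."""

    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))
        self.nfe = 0  # number of function evaluations, reset before each solve

    def forward(self, t, y):
        self.nfe += 1
        return self.net(y)


def count_nfe(func, x, t_span, tol=1e-3):
    """Solve the ODE from initial state x and return the solver's NFE."""
    func.nfe = 0
    with torch.no_grad():
        odeint(func, x, t_span, rtol=tol, atol=tol, method='dopri5')
    return func.nfe


# Hypothetical black-box search: keep random perturbations that raise NFE.
func = ODEFunc(dim=64)
t_span = torch.tensor([0.0, 1.0])
x = torch.randn(1, 64)   # benign input (illustrative)
eps = 0.1                # L_inf perturbation budget (illustrative)
benign_nfe = count_nfe(func, x, t_span)
best_x, best_nfe = x.clone(), benign_nfe

for _ in range(200):
    candidate = x + eps * (2 * torch.rand_like(x) - 1)   # random point in the L_inf ball
    nfe = count_nfe(func, candidate, t_span)
    if nfe > best_nfe:   # higher NFE => more computation, hence more energy
        best_x, best_nfe = candidate, nfe

print(f"benign NFE vs adversarial NFE: {benign_nfe} -> {best_nfe}")
```

In the actual attack, the perturbation would also be constrained to remain imperceptible and would typically be found by optimization rather than random search; the sketch only demonstrates that NFE, and hence energy consumption, is input-dependent under adaptive solvers.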
One-sentence Summary: The paper proposes an adversarial energy attack on Neural ODEs.
Supplementary Material: zip