REVAMP: Automated Simulations of Adversarial Attacks on Arbitrary Objects in Realistic Scenes

Published: 19 Mar 2024 (Last Modified: 15 May 2024) · Tiny Papers @ ICLR 2024 (Notable) · CC BY 4.0
Keywords: Robustness, Physically Realizable, Adversarial Attack, Differentiable Rendering, Scenario Creation
TL;DR: Automated simulations of adversarial attacks on arbitrary objects in realistic scenes using differentiable rendering.
Abstract: Deep learning models, such as those used in autonomous vehicles, are vulnerable to adversarial attacks in which attackers place adversarial objects in the environment to induce incorrect detections. While generating such adversarial objects in the digital realm is well studied, successfully transferring these attacks to the physical realm remains challenging, especially when accounting for real-world environmental factors. We address these challenges with REVAMP, a first-of-its-kind Python library for creating attack scenarios with arbitrary objects in scenes with realistic environmental factors, lighting, reflection, and refraction. REVAMP empowers researchers and practitioners to swiftly explore diverse scenarios, offering a wide range of configurable options for experiment design and using differentiable rendering to replicate physically plausible adversarial objects. REVAMP is open-source and available at https://github.com/poloclub/revamp, and a demo video is available at https://youtu.be/NA0XR0XkS1E.
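The core mechanism the abstract alludes to, optimizing an object's texture through a differentiable renderer so that a downstream detector misbehaves, can be sketched conceptually as below. This is a minimal illustrative sketch, not REVAMP's actual API: the `render`, `scene`, and `detector` arguments are hypothetical placeholders for whatever differentiable renderer and victim model a user configures.

```python
import torch


def optimize_adversarial_texture(render, scene, texture, detector,
                                 target_class, steps=200, lr=0.01):
    """Conceptual sketch of a differentiable-rendering-based attack.

    `render(scene, texture)` is a hypothetical differentiable renderer
    returning an image tensor; `detector(image)` returns class logits.
    Neither corresponds to REVAMP's real interfaces.
    """
    texture = texture.clone().requires_grad_(True)
    opt = torch.optim.Adam([texture], lr=lr)

    for _ in range(steps):
        image = render(scene, texture)        # render scene with current texture
        logits = detector(image)
        # Untargeted attack: push down the score of the true class.
        loss = logits[..., target_class].mean()

        opt.zero_grad()
        loss.backward()                       # gradients flow through the renderer
        opt.step()

        with torch.no_grad():
            texture.clamp_(0.0, 1.0)          # keep texture in a printable/physical range

    return texture.detach()
```

Because the rendering step is differentiable, gradients from the detector's loss propagate back to the texture parameters, which is what allows the optimized object to account for scene lighting, reflection, and refraction rather than only pixel-space perturbations.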
Submission Number: 144