Generating Realistic Physical Adversarial Examples by Patch Transformer Network

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission · Readers: Everyone
Keywords: Adversarial Attack, Physical Adversarial Examples, Object Detection
Abstract: Physical adversarial attacks apply carefully crafted adversarial perturbations to real objects to maliciously alter the predictions of object classifiers or detectors. The current standard method for designing physical adversarial patches, Expectation over Transformations (EoT), simulates real-world environments with random physical transformations, which yields adversarial examples that are far from satisfactory. To tackle this issue, we propose a novel network that learns real-world physical transformations from data, including geometric transformation, printer color transformation, and illumination adaptation. Our approach produces realistic-looking adversarial examples and can be integrated into existing attack generation frameworks to generate adversarial patches effectively. We apply our approach to design adversarial T-shirts worn by moving people, one of the most challenging settings for physical attacks. Experiments show that our approach significantly outperforms the state of the art when attacking DL-based object detectors in real life. Moreover, we build a first-of-its-kind adversarial T-shirt dataset that enables effective training of our approach and facilitates fair comparison of physical-world attacks by standardizing patch size, environmental changes, and object variations. Our code will be made publicly available.
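For readers unfamiliar with the EoT baseline the abstract contrasts against, below is a minimal PyTorch sketch of EoT-style patch optimization: each step averages the detector's confidence over randomly transformed copies of the patch pasted into a scene. The `detector` interface, the transformation ranges, and the masking scheme are illustrative assumptions, not this paper's exact setup.

```python
import torch
import torch.nn.functional as F

def random_eot_transform(patch):
    """One random EoT draw: affine warp (rotation + scale) plus brightness jitter.

    patch: (C, H, W) tensor with values in [0, 1].
    """
    angle = torch.empty(1).uniform_(-0.3, 0.3)   # radians, roughly +/- 17 degrees
    scale = torch.empty(1).uniform_(0.8, 1.2)
    cos, sin = torch.cos(angle) * scale, torch.sin(angle) * scale
    theta = torch.stack([
        torch.cat([cos, -sin, torch.zeros(1)]),
        torch.cat([sin,  cos, torch.zeros(1)]),
    ]).unsqueeze(0)                               # (1, 2, 3) affine matrix
    grid = F.affine_grid(theta, patch[None].shape, align_corners=False)
    warped = F.grid_sample(patch[None], grid, align_corners=False)[0]
    return (warped * torch.empty(1).uniform_(0.7, 1.3)).clamp(0, 1)

def eot_attack_step(detector, scene, patch, mask, optimizer, num_draws=8):
    """Minimize the detector's expected confidence over random transformations.

    Assumptions: detector(x) returns a scalar objectness/confidence score,
    mask marks where the patch is pasted, and optimizer updates [patch].
    """
    loss = torch.stack([
        detector((scene * (1 - mask) + random_eot_transform(patch) * mask)[None])
        for _ in range(num_draws)
    ]).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        patch.clamp_(0, 1)                        # keep the patch printable
    return loss.item()
```

In this sketch the patch is a leaf tensor with `requires_grad=True` and the optimizer is built over `[patch]`; the paper's criticism is that such hand-picked random transformations only coarsely approximate real-world physics.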
One-sentence Summary: A framework that learns physical transformations from data for designing physical adversarial patches
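As a concrete, hypothetical example of learning one such physical transformation from data, a small MLP can be fit on paired color swatches to emulate the digital-to-printed color mapping. The architecture and training setup below are assumptions for illustration; the abstract does not specify the paper's actual network.

```python
import torch
import torch.nn as nn

class PrinterColorModel(nn.Module):
    """Hypothetical learned printer color transformation: a small MLP mapping a
    digital RGB value to the RGB observed after printing and photographing."""

    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),   # printed RGB in [0, 1]
        )

    def forward(self, rgb):                        # rgb: (..., 3)
        return self.net(rgb)

def fit_color_model(model, digital, printed, epochs=200, lr=1e-2):
    """Fit on paired swatches: digital and printed are (N, 3) RGB in [0, 1],
    where printed holds the photographed appearance of each digital color."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        loss = nn.functional.mse_loss(model(digital), printed)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model
```

Once fit, such a module can be composed into the patch-generation pipeline in place of a hand-tuned color approximation, so the optimized patch accounts for how its colors will actually reproduce in print.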
Supplementary Material: zip