Generating Natural Adversarial Examples

15 Feb 2018 (modified: 14 Oct 2024) · ICLR 2018 Conference Blind Submission
Abstract: Due to their complex nature, it is hard to characterize the ways in which machine learning models can misbehave or be exploited when deployed. Recent work on adversarial examples, i.e., inputs with minor perturbations that result in substantially different model predictions, helps evaluate the robustness of these models by exposing the adversarial scenarios where they fail. However, these malicious perturbations are often unnatural, not semantically meaningful, and not applicable to complicated domains such as language. In this paper, we propose a framework that generates natural and legible adversarial examples lying on the data manifold by searching in the semantic space of a dense, continuous data representation, utilizing recent advances in generative adversarial networks. We present generated adversaries that demonstrate the potential of the proposed approach for black-box classifiers in a wide range of applications, such as image classification, textual entailment, and machine translation. Our experiments show that the generated adversaries are natural, legible to humans, and useful for evaluating and analyzing black-box classifiers.
TL;DR: We propose a framework for generating “natural” adversaries against black-box classifiers in both visual and textual domains by searching for adversaries in the latent semantic space.
Keywords: adversarial examples, generative adversarial networks, interpretability, image classification, textual entailment, machine translation
Code: [zhengliz/natural-adversary](https://github.com/zhengliz/natural-adversary)
Data: [LSUN](https://paperswithcode.com/dataset/lsun), [MNIST](https://paperswithcode.com/dataset/mnist)
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/generating-natural-adversarial-examples/code)
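
The abstract describes searching a GAN's latent space for nearby codes whose decoded samples flip a black-box classifier's prediction. Below is a minimal sketch of such an iterative, gradually widening stochastic search, assuming a generator `G`, an inverter `I` that maps inputs to latent codes, and a black-box classifier `f`. All names, signatures, and default parameters here are hypothetical illustrations of the idea, not the linked repository's actual API.

```python
import numpy as np

def natural_adversary(x, f, G, I, delta_r=0.01, n_samples=64,
                      max_radius=2.0, rng=None):
    """Search a GAN's latent space for a natural adversary of input x.

    f(x) -> label   : black-box classifier (queried, never differentiated)
    G(z) -> sample  : generator mapping latent codes to data space
    I(x) -> z       : inverter mapping data to latent codes
    These are hypothetical callables sketching the search, not the
    authors' exact implementation.
    """
    rng = rng or np.random.default_rng(0)
    y = f(x)               # original prediction we want to change
    z = I(x)               # latent code of the clean input
    r = delta_r
    while r <= max_radius:
        # sample perturbations uniformly from the annulus (r - delta_r, r]
        eps = rng.standard_normal((n_samples, z.size))
        radii = rng.uniform(r - delta_r, r, n_samples)
        eps *= (radii / np.linalg.norm(eps, axis=1))[:, None]
        # decode each perturbed code and keep those that flip the prediction
        candidates = [(np.linalg.norm(e), G(z + e)) for e in eps]
        hits = [(d, c) for d, c in candidates if f(c) != y]
        if hits:
            # return the adversary closest to z in latent space
            return min(hits, key=lambda h: h[0])[1]
        r += delta_r       # nothing found: widen the search ring and retry
    return None            # no adversary within max_radius of z
```

This roughly mirrors the incremental search the paper describes: the classifier is treated purely as a query oracle, and perturbations are applied in latent space so that decoded adversaries stay on the data manifold rather than becoming unnatural pixel- or token-level noise.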