Invariance and Inverse Stability under ReLU

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Blind Submission
Abstract: We flip the usual approach to studying invariance and robustness of neural networks by considering the non-uniqueness and instability of the inverse mapping. We provide theoretical and numerical results on the inverse of ReLU layers. First, we derive a necessary and sufficient condition for the existence of invariance that admits a geometric interpretation. Next, we turn to robustness by analyzing local effects on the inverse. To conclude, we show how this reverse point of view not only provides insight into key effects, but also enables viewing adversarial examples from different perspectives.
Keywords: deep neural networks, invertibility, invariance, robustness, ReLU networks
TL;DR: We analyze the invertibility of deep neural networks by studying preimages of ReLU-layers and the stability of the inverse.
Data: [CIFAR-10](https://paperswithcode.com/dataset/cifar-10)
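The non-uniqueness the abstract refers to can be seen already in a single ReLU layer: whenever a pre-activation is negative, ReLU discards it, so distinct inputs can map to the same output. A minimal numerical sketch, using NumPy and hypothetical toy values for the weight matrix and input (not taken from the paper):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Invertible toy weight matrix and an input whose second
# pre-activation is negative (illustrative values only).
W = np.array([[1.0, 1.0],
              [1.0, -1.0]])
x = np.array([0.0, 1.0])

pre = W @ x   # [1., -1.]
y = relu(pre)  # [1., 0.] -- the second unit is switched off

# Shift x so that only the negative pre-activation changes
# (and stays negative): W @ x2 = [1., -0.5], same ReLU output.
x2 = x + np.linalg.solve(W, np.array([0.0, 0.5]))

assert not np.allclose(x, x2)               # distinct inputs ...
assert np.allclose(relu(W @ x2), y)         # ... identical output
```

Because every input whose negative pre-activations vary freely lands on the same output, the preimage of y under the ReLU layer is a set rather than a point, which is the starting observation for the paper's analysis of the inverse.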