On the Lipschitz constant of random ReLU neural networks

Published: 25 Mar 2025, Last Modified: 20 May 2025 · SampTA 2025 Invited Talk · CC BY 4.0
Session: Sampling and learning of deep neural networks (Philipp Petersen)
Keywords: Adversarial robustness, Lipschitz constant, random ReLU networks
TL;DR: We give sharp bounds for the Lipschitz constant of randomly initialized ReLU networks
Abstract: Despite their remarkable success in many applications, trained deep neural networks are known to be vulnerable to so-called adversarial examples: small (sometimes imperceptible) perturbations of the input that cause large changes in the network output. It is therefore of great interest to study the robustness of neural networks, as measured for instance by their Lipschitz constant. In this talk, we will present several results in this direction, which together yield an almost sharp characterization of the Lipschitz constant of randomly initialized ReLU neural networks, both in expectation and with high probability. This is joint work with Sjoerd Dirksen (Utrecht University), Paul Geuchen (KU), Dominik Stoeger (KU), and Thomas Telaar (formerly KU).
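
To make the quantity concrete, here is a minimal numerical sketch (not the authors' method; the widths, depth, and He-style Gaussian initialization are illustrative assumptions). Since a ReLU network is piecewise linear, its Jacobian at a point is the product of the weight matrices masked by the activation pattern, so the largest Jacobian spectral norm over sampled inputs lower-bounds the Lipschitz constant, while the product of the layers' spectral norms upper-bounds it.

import numpy as np

rng = np.random.default_rng(0)
widths = [50, 100, 100, 1]  # hypothetical architecture: input dim 50, two hidden layers, scalar output
# He-style Gaussian initialization (an assumption, matching common practice)
Ws = [rng.normal(0.0, np.sqrt(2.0 / m), size=(n, m))
      for m, n in zip(widths[:-1], widths[1:])]

def jacobian_at(x):
    # For a ReLU network, the Jacobian at x (defined almost everywhere) is
    # W_L D_{L-1} W_{L-1} ... D_1 W_1, where D_i masks the inactive neurons.
    J = Ws[0]
    pre = Ws[0] @ x
    for W in Ws[1:]:
        J = W @ ((pre > 0)[:, None] * J)   # apply activation mask, then next layer
        pre = W @ np.maximum(pre, 0.0)
    return J

# Lower bound: largest Jacobian spectral norm over random inputs.
lower = max(np.linalg.norm(jacobian_at(rng.standard_normal(widths[0])), 2)
            for _ in range(1000))

# Crude upper bound: product of the layers' spectral norms.
upper = np.prod([np.linalg.norm(W, 2) for W in Ws])

print(f"Lipschitz constant lies in [{lower:.3f}, {upper:.3f}]")

The sampled lower bound would be exact if one could visit every activation region, but in practice the two bounds can be far apart; pinning down where the true constant sits for random initializations is exactly the kind of question the talk addresses.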
Submission Number: 6