Estimating Neural Network Robustness via Lipschitz Constant and Architecture Sensitivity

Published: 22 Oct 2024, Last Modified: 06 Nov 2024 · CoRL 2024 Workshop SAFE-ROL · Spotlight Poster · CC BY 4.0
Keywords: Robustness, Lipschitz Continuity, Robot Learning, Neural Networks
TL;DR: We introduce an analytical expression for calculating the Lipschitz constant based on neural network architectures, providing a theoretical framework for assessing and improving network robustness.
Abstract: Ensuring neural network robustness is essential for the safe and reliable operation of robotic learning systems, especially in perception and decision-making tasks within real-world environments. This paper investigates the robustness of neural networks in perception systems, specifically examining their sensitivity to targeted, small-scale perturbations. We identify the Lipschitz constant as a key metric for quantifying and enhancing network robustness. We derive an analytical expression to compute the Lipschitz constant based on neural network architecture, providing a theoretical basis for estimating and improving robustness. Several experiments reveal the relationship between network design, the Lipschitz constant, and robustness, offering practical insights for developing safer, more robust robot learning systems.
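As a rough illustration of architecture-based Lipschitz estimation, the sketch below computes the standard upper bound for a feedforward network with 1-Lipschitz activations (e.g. ReLU): the product of the spectral norms of the layer weight matrices. This is a generic bound offered for context only, not the paper's own analytical expression, and the model and function names are hypothetical.

```python
# Minimal sketch (assumption): bound the Lipschitz constant of an MLP by the
# product of its layers' spectral norms, assuming 1-Lipschitz activations.
# This is the standard bound, not the paper's specific analytical expression.
import torch
import torch.nn as nn

def lipschitz_upper_bound(model: nn.Sequential) -> float:
    """Product of spectral norms over all Linear layers in `model`."""
    bound = 1.0
    for layer in model:
        if isinstance(layer, nn.Linear):
            # Spectral norm = largest singular value of the weight matrix.
            bound *= torch.linalg.matrix_norm(layer.weight, ord=2).item()
    return bound

# Hypothetical perception-style MLP for demonstration.
mlp = nn.Sequential(
    nn.Linear(64, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 10),
)
print(f"Lipschitz upper bound: {lipschitz_upper_bound(mlp):.3f}")
```

A smaller bound suggests the network's outputs change less under small input perturbations, which is the intuition linking architecture, the Lipschitz constant, and robustness in the abstract above.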
Submission Number: 16