Keywords: Neural Network Certification, Model Expressivity, Convex Relaxation
TL;DR: We show that layer-wise multi-neuron convex relaxations can provide complete certification, which allows ReLU networks under convex relaxation to express all continuous piecewise linear functions.
Abstract: Modern neural network certification methods heavily rely on convex relaxations to compute sound bounds. However, the true expressive power of convex relaxations is currently not well understood. Recent work has started investigating this direction, showing that there does not exist a ReLU network expressing even the simple ``$\max$'' function in $\mathbb{R}^2$ whose outputs can be bounded exactly by single-neuron relaxations. This raises the following fundamental question: is there a convex relaxation (beyond single-neuron) that can provide exact bounds for ReLU networks expressing general continuous piecewise linear functions in $\mathbb{R}^n$? In this work, we investigate this question and prove, perhaps surprisingly, that layer-wise multi-neuron relaxations can compute exact bounds for general ReLU networks. Based on this novel result, we show that the expressivity of ReLU networks is no longer limited under multi-neuron relaxations. To the best of our knowledge, this is the first positive result on the completeness of convex relaxations and the expressivity of ReLU networks under convex relaxation, shedding light on the practice of certified robustness.
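For intuition only, here is a minimal sketch (not from the submission) of the looseness the abstract alludes to, using interval bound propagation as an illustrative stand-in for a single-neuron relaxation and the ReLU network $\max(x, y) = y + \mathrm{ReLU}(x - y)$; the network decomposition and the choice of relaxation are assumptions made for this example.

```python
# Minimal sketch (illustrative assumption, not the paper's method): bound the
# ReLU network max(x, y) = y + ReLU(x - y) with interval bound propagation,
# one of the simplest single-neuron relaxations.

def relu_interval(l, u):
    """Single-neuron interval relaxation of ReLU over the pre-activation range [l, u]."""
    return max(0.0, l), max(0.0, u)

def max_via_relu_interval(x_bounds, y_bounds):
    """Propagate interval bounds layer by layer through y + ReLU(x - y)."""
    xl, xu = x_bounds
    yl, yu = y_bounds
    # Pre-activation z = x - y.
    zl, zu = xl - yu, xu - yl
    # Relax the ReLU neuron.
    rl, ru = relu_interval(zl, zu)
    # Output o = y + ReLU(z).
    return yl + rl, yu + ru

if __name__ == "__main__":
    # On the input box x, y in [0, 1], the exact range of max(x, y) is [0, 1],
    # but the relaxed bound is [0, 2], i.e., the certified bound is not exact.
    print(max_via_relu_interval((0.0, 1.0), (0.0, 1.0)))  # (0.0, 2.0)
```

On this example the tighter triangle relaxation still yields a loose upper bound (1.5 instead of the exact 1), which is the kind of gap that the multi-neuron relaxations studied in this work are shown to close.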
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6486