Branch and Bound for Sigmoid-Like Neural Network Verification

Published: 01 Jan 2023 · Last Modified: 25 Jan 2025 · ICFEM 2023 · License: CC BY-SA 4.0
Abstract: The robustness of deep neural networks has received extensive attention and is widely considered to require guarantees from formal verification. For ReLU neural network verification, there are abundant studies and a variety of techniques. However, verifying sigmoid-like neural networks still relies on linear approximation, which inevitably introduces error and leads to imprecise results. To reduce this error and obtain tighter results, we present a branch and bound framework for sigmoid-like neural network verification. Within this framework, we design a neuron splitting method and a branching strategy: the splitting method splits neurons with non-linear sigmoid-like activation functions, and the branching strategy reduces the size of the branch and bound tree, improving verification performance. We implement our verification framework as SigBaB and evaluate it on open-source benchmarks. Experimental results show that our method produces more precise verification results than other state-of-the-art methods, and that our branching strategy outperforms alternative strategies.
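The core idea in the abstract, that splitting a sigmoid neuron's input interval tightens its linear relaxation, can be illustrated with a small sketch. The following Python snippet is a minimal illustration only, not the authors' SigBaB implementation: it assumes a simple chord-slope parallel-line relaxation and a naive midpoint split (the paper's actual relaxation and branching strategy may differ), and it shows that the width of the relaxation "tube" around the sigmoid shrinks on each sub-interval after a split.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def chord_relaxation(l: float, u: float):
    """Sound parallel-line relaxation of sigmoid on [l, u].

    Both bounding lines share the chord slope lam; the offsets are
    chosen so that lam*x + b_low <= sigmoid(x) <= lam*x + b_up holds
    for every x in [l, u]. Returns (lam, b_low, b_up).
    """
    lam = (sigmoid(u) - sigmoid(l)) / (u - l)
    # Extrema of sigmoid(x) - lam*x lie at the endpoints or where
    # sigmoid'(x) = s*(1 - s) = lam, i.e. s = (1 +/- sqrt(1 - 4*lam)) / 2.
    candidates = [l, u]
    if lam <= 0.25:  # sigmoid' never exceeds 0.25, so this always holds
        root = math.sqrt(1.0 - 4.0 * lam)
        for s in ((1.0 - root) / 2.0, (1.0 + root) / 2.0):
            if 0.0 < s < 1.0:
                x = math.log(s / (1.0 - s))  # inverse sigmoid of s
                if l < x < u:
                    candidates.append(x)
    diffs = [sigmoid(x) - lam * x for x in candidates]
    return lam, min(diffs), max(diffs)

def gap(l: float, u: float) -> float:
    """Vertical width of the relaxation tube on [l, u]."""
    _, b_low, b_up = chord_relaxation(l, u)
    return b_up - b_low

# Splitting one sigmoid neuron's input interval tightens the relaxation
# on each branch, at the cost of one extra subproblem to bound.
l, u = -3.0, 3.0
m = 0.5 * (l + u)  # naive midpoint split; a real branching strategy picks better points
print(f"gap on [{l}, {u}]: {gap(l, u):.4f}")  # ~0.18 before splitting
print(f"gap on [{l}, {m}]: {gap(l, m):.4f}")  # ~0.09 on the left branch
print(f"gap on [{m}, {u}]: {gap(m, u):.4f}")  # ~0.09 on the right branch
```

In this toy example the midpoint split roughly halves the relaxation gap on each branch. A branch and bound verifier trades this per-branch tightening against the growth of the search tree, which is exactly the cost that the paper's branching strategy is designed to control.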
