Complete Verification via Multi-Neuron Relaxation Guided Branch-and-Bound

29 Sept 2021, 00:35 (edited 15 Mar 2022) · ICLR 2022 Poster
  • Keywords: Certified Robustness, Branch-and-Bound, Convex Relaxation
  • Abstract: State-of-the-art neural network verifiers are fundamentally based on one of two paradigms: either encoding the whole verification problem via tight multi-neuron convex relaxations or applying a Branch-and-Bound (BaB) procedure that leverages imprecise but fast bounding methods on a large number of easier subproblems. The former can capture complex multi-neuron dependencies but sacrifices completeness due to the inherent limitations of convex relaxations. The latter enables complete verification but becomes increasingly ineffective on larger and more challenging networks. In this work, we present a novel complete verifier which combines the strengths of both paradigms: it leverages multi-neuron relaxations to drastically reduce the number of subproblems generated during the BaB process, and an efficient GPU-based dual optimizer to solve the remaining ones. An extensive evaluation demonstrates that our verifier achieves a new state-of-the-art both on established benchmarks and on networks with significantly higher accuracy than previously considered. The latter result (up to 28% certification gains) indicates meaningful progress towards creating verifiers that can handle practically relevant networks.
  • One-sentence Summary: We obtain a state-of-the-art GPU-based neural network verifier by leveraging tight multi-neuron constraints in a Branch-and-Bound setting.
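To make the BaB paradigm described in the abstract concrete, below is a minimal, self-contained sketch of a branch-and-bound verification loop. This is not the paper's MN-BaB method (which uses multi-neuron relaxations and a GPU dual optimizer); it substitutes the simplest possible bounding method, interval arithmetic, and splits the input box, purely to illustrate how cheap-but-imprecise bounds plus branching yield a complete procedure. All function names and the toy two-layer ReLU network are hypothetical.

```python
# Illustrative branch-and-bound verifier: prove f(x) > 0 for all x in a box,
# using fast interval-arithmetic bounds and splitting the widest input
# dimension whenever the bound is inconclusive. A sketch, not the paper's
# actual algorithm.
from collections import deque


def interval_bounds(lo, hi, w1, b1, w2, b2):
    """Bound the 2-layer ReLU net  w2 . relu(w1 @ x + b1) + b2  over the
    box [lo, hi] with plain interval arithmetic (fast but imprecise)."""
    h_lo, h_hi = [], []
    for row, b in zip(w1, b1):
        # pre-activation bounds: pick the box corner minimizing/maximizing
        l = b + sum(w * (lo[i] if w >= 0 else hi[i]) for i, w in enumerate(row))
        u = b + sum(w * (hi[i] if w >= 0 else lo[i]) for i, w in enumerate(row))
        h_lo.append(max(l, 0.0))  # ReLU is monotone, so just clamp
        h_hi.append(max(u, 0.0))
    out_lo = b2 + sum(w * (h_lo[j] if w >= 0 else h_hi[j]) for j, w in enumerate(w2))
    out_hi = b2 + sum(w * (h_hi[j] if w >= 0 else h_lo[j]) for j, w in enumerate(w2))
    return out_lo, out_hi


def bab_verify(lo, hi, net, max_subproblems=10_000):
    """Return True if f(x) > 0 is proved on the whole box; False if a
    subproblem is refuted (upper bound <= 0) or the budget runs out."""
    queue = deque([(list(lo), list(hi))])
    while queue:
        if max_subproblems == 0:
            return False  # inconclusive within budget: report failure
        max_subproblems -= 1
        l, h = queue.popleft()
        out_lo, out_hi = interval_bounds(l, h, *net)
        if out_lo > 0:
            continue          # subproblem certified, prune this branch
        if out_hi <= 0:
            return False      # whole region violates the property
        # inconclusive: branch by bisecting the widest input dimension
        d = max(range(len(l)), key=lambda i: h[i] - l[i])
        mid = 0.5 * (l[d] + h[d])
        left_hi, right_lo = h[:], l[:]
        left_hi[d], right_lo[d] = mid, mid
        queue.append((l[:], left_hi))
        queue.append((right_lo, h[:]))
    return True  # every subproblem was certified


# Toy net f(x) = relu(x) - relu(x - 1) + 0.1 = min(x, 1) + 0.1 on [0, 2].
# At the root, intervals lose the dependency between the two ReLUs and the
# lower bound is -0.9; one split at x = 1 certifies both halves.
net = ([[1.0], [1.0]], [0.0, -1.0], [1.0, -1.0], 0.1)
print(bab_verify([0.0], [2.0], net))
```

The paper's key observation is that the quality of the bounding method controls how many subproblems this loop generates: replacing the interval step above with tight multi-neuron relaxations prunes far more branches early, which is exactly the trade-off the abstract describes.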