Partition-Based Formulations for Mixed-Integer Optimization of Trained ReLU Neural Networks

21 May 2021, 20:43 (modified: 25 Oct 2021, 12:35) · NeurIPS 2021 Poster · Readers: Everyone
Keywords: Mixed-integer programming, Deep learning, Rectified linear activation
Abstract: This paper introduces a class of mixed-integer formulations for trained ReLU neural networks. The approach balances model size and tightness by partitioning node inputs into a number of groups and forming the convex hull over the partitions via disjunctive programming. At one extreme, one partition per input recovers the convex hull of a node, i.e., the tightest possible formulation for each node. With fewer partitions, we develop smaller relaxations that approximate the convex hull and show that they outperform existing formulations. Specifically, we propose theoretically motivated strategies for partitioning variables and validate these strategies through extensive computational experiments. Furthermore, the proposed scheme complements known algorithmic approaches, e.g., optimization-based bound tightening, which captures dependencies within a partition.
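
For context, a minimal sketch of the standard big-M encoding of a single trained ReLU node $y = \max(0, w^\top x + b)$ over box bounds $L \le x \le U$; the constants $M^+$ and $M^-$ are my notation here, not taken from the paper:

\[
\begin{aligned}
& y \ge w^\top x + b, \qquad y \ge 0,\\
& y \le w^\top x + b - M^-(1-\sigma),\\
& y \le M^+\sigma, \qquad \sigma \in \{0,1\},
\end{aligned}
\]

where $M^+ \ge \max_{L \le x \le U}\, (w^\top x + b)$ and $M^- \le \min_{L \le x \le U}\, (w^\top x + b)$. Setting $\sigma = 1$ forces $y = w^\top x + b \ge 0$ (active node); setting $\sigma = 0$ forces $y = 0$ and $w^\top x + b \le 0$ (inactive node). Per the abstract, the paper's scheme instead applies disjunctive programming to this active/inactive disjunction after grouping the inputs into partitions: one partition per input recovers the convex hull of the node, while fewer partitions yield smaller intermediate relaxations, with a single partition essentially reducing to the big-M formulation above.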
Supplementary Material: pdf
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
Code: https://github.com/cog-imperial/PartitionedFormulations_NN