Nonlinear Optimization with GPU-Accelerated Neural Network Constraints

Published: 22 Sept 2025 · Last Modified: 25 Nov 2025 · ScaleOPT Oral · CC BY 4.0
Keywords: Nonlinear optimization; GPU; neural network; interior point; automatic differentiation
TL;DR: A reduced-space formulation that exploits GPUs improves the performance of neural network-constrained nonlinear optimization problems.
Abstract: We propose a reduced-space formulation for optimizing over trained neural networks in which the network's outputs and derivatives are evaluated on a GPU. To do this, we treat the neural network as a "gray box" whose intermediate variables and constraints are not exposed to the optimization solver. Compared to the full-space formulation, in which intermediate variables and constraints *are* exposed to the optimization solver, the reduced-space formulation leads to faster solves and fewer iterations in an interior point method. We demonstrate the benefits of this method on two optimization problems: adversarial generation for a classifier trained on MNIST images, and security-constrained optimal power flow with transient feasibility enforced using a neural network surrogate.
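To illustrate the distinction the abstract draws, the following is a minimal NumPy sketch (CPU-only, standing in for the paper's GPU evaluation) of the reduced-space "gray box" view: the solver sees only the network's inputs, outputs, and an input-output Jacobian, while the layer intermediates stay hidden. The two-layer MLP, its random placeholder weights, and the constraint `c(x)` are illustrative assumptions, not the paper's trained models or formulation.

```python
import numpy as np

# Hypothetical two-layer MLP standing in for a trained network;
# the weights are random placeholders for illustration only.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 3)), rng.standard_normal(4)
W2, b2 = rng.standard_normal((2, 4)), rng.standard_normal(2)

def nn(x):
    # Forward pass: the intermediates z and h stay internal ("gray box").
    z = W1 @ x + b1
    h = np.tanh(z)
    return W2 @ h + b2

def nn_jacobian(x):
    # Chain-rule Jacobian of the output w.r.t. the input only --
    # the derivative information a reduced-space solver queries.
    z = W1 @ x + b1
    D = np.diag(1.0 - np.tanh(z) ** 2)  # elementwise tanh'(z)
    return W2 @ D @ W1

# Reduced-space constraint c(x) = nn(x) - target: the solver handles
# 3 variables and 2 constraints. A full-space formulation would also
# expose z and h (8 extra variables) plus the layer equations
# z = W1 x + b1 and h = tanh(z) as additional equality constraints.
target = np.zeros(2)

def c(x):
    return nn(x) - target

x0 = np.array([0.1, -0.2, 0.3])
J = nn_jacobian(x0)
```

In an interior point solver, `c` and `nn_jacobian` would be supplied as constraint and Jacobian callbacks; on a GPU these evaluations would be batched through the network's native framework rather than NumPy.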
Submission Number: 23