BiGrad: Differentiating through Bilevel Optimization Programming

Published: 02 Dec 2021, Last Modified: 05 May 2023
AAAI-22 AdvML Workshop Long Paper
Readers: Everyone
Keywords: Optimization, Machine Learning, Adversarial Systems
TL;DR: We propose BiGrad, a method to estimate gradients for bilevel optimization problems. Bilevel programming models systems with conflicting objectives, as in adversarial models.
Abstract: Integrating mathematical programming, and in particular Bilevel Optimization Programming, within deep learning architectures has broad applications in domains ranging from machine learning to engineering. Bilevel programming can capture the complex interactions that arise when two actors have conflicting objectives. Previous approaches consider only single-level programming. In this paper, we therefore propose Differentiating through Bilevel Optimization Programming (BiGrad), an approach for end-to-end learning of models that use bilevel programming as a layer. BiGrad is widely applicable and can be used within modern machine learning frameworks. We focus on two classes of bilevel problems: continuous and combinatorial optimization. The framework extends existing approaches for single-level optimization programming. For the combinatorial case, we describe a class of gradient estimators that reduces the computational complexity requirements; for the continuous case, the gradient computation takes advantage of the pullback (i.e., vector-Jacobian product) for an efficient implementation. Experiments suggest that the proposed approach successfully extends existing single-level approaches to bilevel programming.
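The paper's implementation is not shown on this page; purely as an illustration of the continuous-variable case described in the abstract, the sketch below differentiates through a lower-level problem using the implicit function theorem and vector-Jacobian products in JAX. The quadratic lower-level objective and the names `lower_obj`, `lower_solve`, `upper_obj`, and `hypergradient` are hypothetical choices for this sketch, not taken from the paper.

```python
import jax
import jax.numpy as jnp

# Hypothetical lower-level objective f(x, y) = 0.5 * ||y - A x||^2 (illustration only).
A = jnp.array([[2.0, 0.0], [0.0, 3.0]])

def lower_obj(x, y):
    return 0.5 * jnp.sum((y - A @ x) ** 2)

def lower_solve(x):
    # Closed-form argmin_y f(x, y); a real bilevel layer would call an iterative solver here.
    return A @ x

def upper_obj(x, y):
    return jnp.sum(y ** 2) + jnp.sum(x ** 2)

def hypergradient(x):
    """Gradient of x -> upper_obj(x, y*(x)) via the implicit function theorem."""
    y_star = lower_solve(x)
    gx = jax.grad(upper_obj, argnums=0)(x, y_star)   # direct term dF/dx
    gy = jax.grad(upper_obj, argnums=1)(x, y_star)   # dF/dy, to be pulled back through y*(x)
    grad_y = jax.grad(lower_obj, argnums=1)          # df/dy, which vanishes at y*(x)
    # Solve (d^2 f / dy^2) v = gy by conjugate gradients, using Hessian-vector products only.
    hvp = lambda v: jax.jvp(lambda y: grad_y(x, y), (y_star,), (v,))[1]
    v, _ = jax.scipy.sparse.linalg.cg(hvp, gy)
    # Pull v back through the mixed derivative d^2 f / dx dy with a single VJP.
    _, vjp_x = jax.vjp(lambda x_: grad_y(x_, y_star), x)
    return gx - vjp_x(v)[0]

x0 = jnp.array([1.0, -1.0])
print(hypergradient(x0))  # equals 2*x0 + 2*A.T @ A @ x0 for this quadratic example
```

The conjugate-gradient solve avoids ever materializing the lower-level Hessian, which is what makes the pullback (vector-Jacobian product) formulation efficient in practice.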