Landscape Surrogate: Learning Decision Losses for Mathematical Optimization Under Partial Information

Published: 20 Jun 2023, Last Modified: 17 Sept 2023 (Differentiable Almost Everything Workshop)
Keywords: learning surrogates, decision-focused learning, combinatorial nonlinear optimization, argmin differentiation, alternating optimization
TL;DR: We introduce LANCER, a versatile framework for tackling challenging optimization problems such as nonlinear combinatorial optimization and decision-focused learning.
Abstract: Recent work in learning-integrated optimization has shown promise in settings where the optimization problem is only partially observed or where general-purpose optimizers perform poorly without expert tuning. Learning an optimizer $\mathbf{g}$ to tackle these challenging problems with $f$ as the objective can substantially accelerate the optimization process by leveraging past experience. The optimizer can be trained with supervision from known optimal solutions (not always available) or implicitly by optimizing the compound function $f\circ \mathbf{g}$, but the implicit approach is slow and difficult due to frequent calls to the optimizer and sparse gradients, particularly for combinatorial solvers. To address these challenges, we propose using a smooth and learnable Landscape Surrogate $\mathcal{M}$ in place of $f\circ \mathbf{g}$. This surrogate can be computed faster than $\mathbf{g}$, provides dense and smooth gradients during training, generalizes to unseen optimization problems, and is learned efficiently via alternating optimization. We test our approach on both synthetic and real-world problems, achieving objective values comparable to or better than state-of-the-art baselines while reducing the number of calls to $\mathbf{g}$. Notably, our approach outperforms existing methods on computationally expensive high-dimensional problems.
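To make the alternating scheme concrete, the following is a minimal sketch of how such a surrogate might be trained in a decision-focused setting. Every name here (the toy solver, decision loss, and both networks) is a hypothetical stand-in under assumed interfaces, not the paper's actual implementation or architecture.

```python
# A minimal sketch of alternating optimization with a learned landscape
# surrogate M(c) standing in for the compound loss f(g(c)).
# All components below are hypothetical placeholders, not LANCER's code.
import torch
import torch.nn as nn

def solver(c):
    # Placeholder for the (possibly non-differentiable) optimizer g:
    # a toy combinatorial step that picks the 5 lowest-cost items.
    return torch.topk(-c, k=5, dim=-1).indices

def decision_loss(sol, c_true):
    # Placeholder for f: cost of the chosen items under true parameters.
    return c_true.gather(-1, sol).sum(dim=-1)

dim = 20
predictor = nn.Linear(8, dim)           # maps features x to problem params c
surrogate = nn.Sequential(              # smooth learnable surrogate M
    nn.Linear(2 * dim, 64), nn.ReLU(), nn.Linear(64, 1))
opt_phi = torch.optim.Adam(predictor.parameters(), lr=1e-3)
opt_theta = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

x = torch.randn(128, 8)                 # synthetic features
c_true = torch.rand(128, dim)           # synthetic true parameters

for outer in range(10):
    # Step 1: fit the surrogate to the true decision loss f(g(c)) on
    # current predictions -- the only place the solver is called.
    with torch.no_grad():
        c_hat = predictor(x)
        target = decision_loss(solver(c_hat), c_true)
    for _ in range(50):
        pred = surrogate(torch.cat([c_hat, c_true], dim=-1)).squeeze(-1)
        loss_theta = ((pred - target) ** 2).mean()
        opt_theta.zero_grad(); loss_theta.backward(); opt_theta.step()

    # Step 2: update the predictor through the smooth surrogate,
    # obtaining dense gradients without further solver calls.
    for _ in range(50):
        c_hat = predictor(x)
        loss_phi = surrogate(torch.cat([c_hat, c_true], dim=-1)).mean()
        opt_phi.zero_grad(); loss_phi.backward(); opt_phi.step()
```

In this sketch the expensive solver is invoked only once per outer iteration to generate fitting targets, while the predictor trains against the surrogate's dense, smooth gradients, which is the source of the claimed reduction in calls to $\mathbf{g}$.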
Submission Number: 15