Faster Discrete Convex Function Minimization with Predictions: The M-Convex Case

Published: 21 Sept 2023, Last Modified: 02 Nov 2023, NeurIPS 2023 poster
Keywords: algorithms with predictions, beyond the worst-case analysis of algorithms, time complexity, combinatorial optimization, discrete convex analysis, submodular functions
TL;DR: We present a framework for accelerating M-convex function minimization with predictions, thus complementing previous research and extending the range of optimization algorithms that can benefit from predictions.
Abstract: Recent years have seen a growing interest in accelerating optimization algorithms with machine-learned predictions. Sakaue and Oki (NeurIPS 2022) developed a general framework that warm-starts the *L-convex function minimization* method with predictions, revealing the idea's usefulness for various discrete optimization problems. In this paper, we present a framework for using predictions to accelerate *M-convex function minimization*, thus complementing previous research and extending the range of discrete optimization algorithms that can benefit from predictions. Our framework is particularly effective for an important subclass called *laminar convex minimization*, which appears in many operations research applications. By using predictions, our methods can improve upon the best known worst-case time complexity bounds and even have the potential to go beyond a lower-bound result.
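As a rough illustration of the warm-starting idea described above (not the paper's own algorithm), the sketch below applies a standard greedy exchange method to a simple separable convex resource allocation problem, a basic special case of laminar/M-convex minimization, starting from a predicted allocation. The function names and interface are hypothetical; the intuition is that the number of exchange steps shrinks when the prediction is close to an optimum.

```python
# Minimal sketch (assumed setting): separable convex resource allocation
#   minimize  sum_i f_i(x_i)   s.t.  sum_i x_i = R,  x_i >= 0 integer,
# a simple special case of M-convex minimization. The greedy exchange
# method is warm-started from a predicted allocation x_hat.

def warm_started_greedy_exchange(fs, x_hat):
    """fs: list of convex functions on nonnegative integers (hypothetical interface).
    x_hat: predicted feasible allocation (nonnegative integers summing to R)."""
    x = list(x_hat)
    n = len(x)
    while True:
        best_delta, best_move = 0, None
        for i in range(n):          # move one unit out of coordinate i ...
            if x[i] == 0:
                continue
            for j in range(n):      # ... into coordinate j
                if i == j:
                    continue
                delta = (fs[i](x[i] - 1) - fs[i](x[i])
                         + fs[j](x[j] + 1) - fs[j](x[j]))
                if delta < best_delta:
                    best_delta, best_move = delta, (i, j)
        if best_move is None:
            # No improving single-unit exchange: local optimality implies
            # global optimality for M-convex objectives.
            return x
        i, j = best_move
        x[i] -= 1
        x[j] += 1

# Toy usage: quadratic costs around targets c_i; the prediction is already
# near the optimum, so very few exchange steps are needed.
if __name__ == "__main__":
    c = [3, 7, 2]
    fs = [lambda v, t=t: (v - t) ** 2 for t in c]
    x_hat = [4, 6, 2]               # feasible prediction with total R = 12
    print(warm_started_greedy_exchange(fs, x_hat))   # -> [3, 7, 2]
```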
Supplementary Material: pdf
Submission Number: 21