Gradient-based Algorithms for Pessimistic Bilevel Optimization

22 Sept 2022 (modified: 13 Feb 2023) · ICLR 2023 Conference Withdrawn Submission · Readers: Everyone
Keywords: pessimistic bilevel optimization, convergence analysis, nonconvex, gradient-based method
TL;DR: We propose the first gradient-based algorithms for pessimistic bilevel optimization, provide the first convergence results for nonlinear objective functions, and validate our design and analysis through experiments on several robust learning problems.
Abstract: As a powerful framework for a variety of machine learning problems, bilevel optimization has attracted considerable attention. While many modern gradient-based algorithms have been devised for optimistic bilevel optimization, pessimistic bilevel optimization (PBO) remains underexplored and has only been studied in linear settings. To fill this void, we investigate PBO with nonlinear inner- and outer-level objective functions by reformulating it into a single-level constrained optimization problem. We first propose two gradient-based algorithms for the reformulated problem: the switching gradient method (SG-PBO) and the primal-dual method (PD-PBO). By carefully handling the bias in the gradient estimates that arises from the bilevel structure, we show that both SG-PBO and PD-PBO converge to the global minimum of the reformulated problem when it is strongly convex, which immediately implies convergence for the original PBO. Moreover, we propose a proximal scheme (Prox-PBO) with a convergence guarantee for the nonconvex reformulated problem. To the best of our knowledge, this is the first work that investigates gradient-based algorithms and provides convergence analysis for PBO in nonlinear settings. We further conduct experiments on an illustrative example and a robust hyperparameter learning problem, which clearly validate our algorithmic design and theoretical analysis.
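For readers unfamiliar with the setup, the following is a minimal sketch of the standard pessimistic bilevel problem and its value-function reformulation into a single-level constrained problem; the notation (f, g, g*, epsilon) is illustrative and may differ from the paper's exact formulation.

```latex
% Pessimistic bilevel optimization (PBO): the outer level hedges against
% the worst-case inner solution.
\[
  \min_{x}\ \max_{y \in S(x)} f(x, y),
  \qquad S(x) = \operatorname*{arg\,min}_{y} g(x, y).
\]
% Value-function reformulation into a single-level constrained problem,
% with g^*(x) = \min_z g(x, z) and a small relaxation \epsilon \ge 0 often
% added to keep the feasible set well behaved:
\[
  \min_{x}\ \max_{y}\ f(x, y)
  \quad \text{s.t.} \quad g(x, y) \le g^*(x) + \epsilon.
\]
```

Likewise, here is a minimal sketch of a Polyak-style switching (sub)gradient loop, the family of methods a "switching gradient" scheme such as SG-PBO belongs to. The callables grad_F, grad_G, G and all parameter values are hypothetical; this is not the paper's algorithm, only the generic template on a single-level constrained problem.

```python
import numpy as np

def switching_gradient(z0, grad_F, grad_G, G, step=1e-2, tol=1e-3, iters=2000):
    """Polyak-style switching (sub)gradient method for min F(z) s.t. G(z) <= 0.

    Step along the objective gradient while the constraint is (nearly)
    satisfied; otherwise step along the constraint gradient to restore
    feasibility. grad_F, grad_G, G are user-supplied callables.
    """
    z = np.asarray(z0, dtype=float)
    for _ in range(iters):
        if G(z) <= tol:   # feasible enough: descend on the objective
            z = z - step * grad_F(z)
        else:             # infeasible: reduce constraint violation
            z = z - step * grad_G(z)
    return z

# Toy usage: minimize ||z||^2 subject to 1 - z[0] <= 0 (i.e., z[0] >= 1).
z = switching_gradient(
    z0=[2.0, 2.0],
    grad_F=lambda z: 2.0 * z,
    grad_G=lambda z: np.array([-1.0, 0.0]),
    G=lambda z: 1.0 - z[0],
)
# z converges to approximately [1, 0], the constrained minimizer.
```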
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Optimization (e.g., convex and non-convex optimization)