An optimal gradient method for smooth (possibly strongly) convex minimization

CoRR 2021 (modified: 18 Apr 2023)
Abstract: We present an optimal gradient method for smooth strongly convex optimization. The method is optimal in the sense that its worst-case bound on the distance to an optimal point exactly matches the lower bound on the oracle complexity for the class of problems, meaning that no black-box first-order method can have a better worst-case guarantee without further assumptions on the class of problems at hand. In addition, we provide a constructive recipe for obtaining the algorithmic parameters of the method and illustrate that it can be used for deriving methods for other optimality criteria as well.
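The abstract does not state the method's update rule, so the sketch below is not the paper's algorithm; it is a standard baseline for the same problem class, Nesterov's constant-momentum accelerated gradient scheme for an L-smooth, μ-strongly convex objective. The function names and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def accelerated_gradient(grad, x0, L, mu, n_iters):
    """Nesterov's constant-momentum scheme for L-smooth, mu-strongly convex f.

    This is a classical baseline, NOT the optimal method of the paper above;
    it illustrates the problem class (black-box first-order optimization).
    """
    kappa = L / mu                                    # condition number
    beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # momentum coefficient
    x = x0.copy()
    y = x0.copy()
    for _ in range(n_iters):
        x_next = y - grad(y) / L        # gradient step from extrapolated point
        y = x_next + beta * (x_next - x)  # momentum extrapolation
        x = x_next
    return x

# Illustrative problem: f(x) = 0.5 * x^T A x with eigenvalues in [mu, L],
# so mu = 1, L = 10 and the minimizer is the origin.
A = np.diag([1.0, 10.0])
x_star_estimate = accelerated_gradient(
    grad=lambda x: A @ x,
    x0=np.array([5.0, 5.0]),
    L=10.0, mu=1.0, n_iters=100,
)
```

For strongly convex problems this baseline contracts the distance to the optimum at a linear rate depending on √(L/μ); the paper's contribution is a method whose worst-case bound on that distance exactly matches the oracle-complexity lower bound.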