On Some Versions of Subspace Optimization Methods with Inexact Gradient Information

Published: 20 Sept 2024, Last Modified: 20 Sept 2024 (ICOMP Publication, CC BY 4.0)
Keywords: Non-convex Optimization, Inexact Gradient, Subspace Optimization
Abstract: It is well known that accelerated first-order gradient methods possess optimal complexity estimates for the class of smooth convex minimization problems. In many practical situations it makes sense to work with inexact gradients; however, this can lead to an accumulation of the corresponding inexactness in the theoretical convergence-rate estimates. We propose modifications of convex optimization methods with an inexact gradient based on subspace optimization, such as Nemirovski's Conjugate Gradients and Sequential Subspace Optimization. We study their convergence under different conditions on the inexactness, both in the gradient values and in the accuracy of the auxiliary subspace optimization problems. In addition, we investigate a generalization of these results to the class of quasar-convex (weakly quasi-convex) functions.
Submission Number: 75
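
For orientation, the sketch below illustrates the general shape of a Sequential Subspace Optimization (SESOP)-style iteration with an inexact gradient oracle satisfying ||g~(x) - grad f(x)|| <= delta; it is not the authors' exact method. The noise model, the two-dimensional subspace (current gradient plus a Nesterov-weighted sum of past gradients, loosely following the standard SESOP construction of Narkiss and Zibulevsky), and the Nelder-Mead inner solver are all illustrative assumptions. Note that the inner subproblem is itself solved only approximately, mirroring the second source of inexactness the abstract mentions. (Quasar-convexity, for reference, means f(x*) >= f(x) + (1/gamma) <grad f(x), x* - x> for some gamma in (0, 1] and a minimizer x*.)

    import numpy as np
    from scipy.optimize import minimize

    def inexact_grad(grad, x, delta, rng):
        """Exact gradient corrupted by bounded noise: ||g~ - g|| <= delta."""
        g = grad(x)
        noise = rng.standard_normal(g.shape)
        noise *= delta / (np.linalg.norm(noise) + 1e-16)
        return g + noise

    def sesop_inexact(f, grad, x0, delta=1e-3, iters=100, seed=0):
        """SESOP-style loop: at each step, (approximately) minimize f over
        the affine subspace spanned by the current inexact gradient and an
        accumulated history direction. Illustrative sketch only."""
        rng = np.random.default_rng(seed)
        x = x0.copy()
        d_hist = np.zeros_like(x0)       # weighted sum of past gradients
        for k in range(iters):
            g = inexact_grad(grad, x, delta, rng)
            d_hist += (k + 1) / 2.0 * g  # Nesterov-style weights
            D = np.stack([g, d_hist])    # basis of the 2-D search subspace
            # Inner subproblem over the subspace coordinates alpha,
            # solved inexactly by a derivative-free method.
            sub = lambda a: f(x + D.T @ a)
            res = minimize(sub, np.zeros(2), method="Nelder-Mead")
            x = x + D.T @ res.x
        return x

    # Usage on a simple quadratic (hypothetical test problem):
    A = np.diag(np.arange(1.0, 11.0))
    b = np.ones(10)
    x = sesop_inexact(lambda x: 0.5 * x @ A @ x - b @ x,
                      lambda x: A @ x - b,
                      np.zeros(10), delta=1e-4, iters=50)

The appeal of the subspace viewpoint, which this sketch tries to convey, is that each outer step only requires solving a low-dimensional auxiliary problem, so both the gradient error delta and the inner-solver accuracy enter the convergence estimates as separate, controllable inexactness terms.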