Improved Iteration Complexity in Black-Box Optimization Problems under Higher Order Smoothness Function Condition

Published: 20 Sept 2024, Last Modified: 20 Sept 2024, ICOMP Publication, CC BY 4.0
Keywords: Black-box optimization, Higher order smoothness function, Strongly convex optimization, Maximum noise level
Abstract: This paper studies the black-box optimization problem, which is common in many applications, where the black box is a gradient-free oracle $\tilde{f}(x) = f(x) + \xi$ that returns the objective function value corrupted by stochastic noise. Assuming that the objective function is $\mu$-strongly convex and not merely $L$-smooth but has a higher order of smoothness ($\beta \geq 2$), we propose a novel optimization method: Zero-Order Accelerated Batched Stochastic Gradient Descent. Its theoretical analysis closes the question of iteration complexity by achieving optimal estimates. Moreover, we provide a thorough analysis of the maximum noise level and show under which conditions the maximum noise level depends on the batch size $B$ as well as on the smoothness order $\beta$ of the function.
Submission Number: 69
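
The sketch below illustrates the general shape of a zero-order accelerated batched method of the kind described in the abstract. It is not the paper's algorithm: the two-point sphere-sampling gradient estimator, the constant Nesterov momentum, and all names (`zo_gradient_estimate`, `zo_accelerated_bsgd`) are illustrative assumptions; in particular, the paper presumably exploits the higher-order smoothness ($\beta \geq 2$) through a kernel-based estimator and tuned step sizes that are not reproduced here.

```python
import numpy as np

def zo_gradient_estimate(oracle, x, gamma, batch_size, rng):
    # Batched two-point zero-order gradient estimate (illustrative choice).
    # oracle(x) is assumed to return f(x) + noise; gamma is the smoothing radius.
    d = x.size
    g = np.zeros(d)
    for _ in range(batch_size):
        e = rng.normal(size=d)
        e /= np.linalg.norm(e)  # random direction on the unit l2-sphere
        g += (oracle(x + gamma * e) - oracle(x - gamma * e)) * e
    return d / (2.0 * gamma * batch_size) * g

def zo_accelerated_bsgd(oracle, x0, mu, L, gamma, batch_size, n_iters, seed=0):
    # Standard Nesterov constant-momentum loop for a mu-strongly convex,
    # L-smooth objective, with the true gradient replaced by the batched
    # zero-order estimate above (a stand-in for the paper's tuned scheme).
    rng = np.random.default_rng(seed)
    x_prev = x0.copy()
    y = x0.copy()
    momentum = (np.sqrt(L / mu) - 1.0) / (np.sqrt(L / mu) + 1.0)
    for _ in range(n_iters):
        g = zo_gradient_estimate(oracle, y, gamma, batch_size, rng)
        x = y - g / L                    # gradient step at the extrapolated point
        y = x + momentum * (x - x_prev)  # momentum extrapolation
        x_prev = x
    return x_prev

# Usage on a noisy quadratic oracle f(x) = 0.5 * ||x||^2 + xi (hypothetical test problem)
noise_rng = np.random.default_rng(1)
oracle = lambda x: 0.5 * float(x @ x) + 1e-3 * noise_rng.standard_normal()
x_hat = zo_accelerated_bsgd(oracle, x0=np.ones(10), mu=1.0, L=1.0,
                            gamma=1e-2, batch_size=8, n_iters=200)
```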
