TL;DR: We propose and analyze a parameter-free and near-optimal zeroth-order algorithm for stochastic convex optimization.
Abstract: This paper studies zeroth-order optimization for stochastic convex minimization problems. We propose a parameter-free stochastic zeroth-order method (POEM), which introduces a step-size scheme based on the distance over finite differences together with an adaptive smoothing parameter. Our theoretical analysis shows that POEM achieves near-optimal stochastic zeroth-order oracle complexity. Furthermore, numerical experiments demonstrate that POEM outperforms existing zeroth-order methods in practice.
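The listing does not include the algorithm itself, so the following Python sketch only illustrates the idea named in the abstract: a step size whose numerator is the distance traveled and whose denominator accumulates finite-difference magnitudes (in the spirit of DoG-style schemes), paired with a decreasing smoothing parameter. The function names (`poem_sketch`, `two_point_grad_estimate`), the schedule `mu = 1/sqrt(t)`, and the max-distance update are all assumptions for illustration, not the authors' exact method.

```python
import numpy as np

def two_point_grad_estimate(f, x, mu, rng):
    """Two-point finite-difference gradient estimator with smoothing parameter mu."""
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)                       # random unit direction
    return x.size * (f(x + mu * u) - f(x)) / mu * u

def poem_sketch(f, x0, T=1000, r_eps=1e-4, seed=0):
    """Illustrative sketch (NOT the published algorithm): step size =
    largest distance from x0 over the root of accumulated squared
    finite-difference estimates, with an assumed smoothing schedule."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    max_dist = r_eps * (1.0 + np.linalg.norm(x0))  # small initial movement scale (assumed)
    grad_sq_sum = 1e-12                            # avoids division by zero
    for t in range(1, T + 1):
        mu = 1.0 / np.sqrt(t)                      # assumed adaptive smoothing schedule
        g = two_point_grad_estimate(f, x, mu, rng)
        grad_sq_sum += float(g @ g)
        step = max_dist / np.sqrt(grad_sq_sum)     # distance over finite differences
        x -= step * g
        max_dist = max(max_dist, float(np.linalg.norm(x - x0)))
    return x

if __name__ == "__main__":
    # Usage: minimize a noisy quadratic from function evaluations only.
    noise = np.random.default_rng(1)
    noisy_quadratic = lambda z: float(z @ z) + noise.normal(scale=1e-3)
    x_hat = poem_sketch(noisy_quadratic, np.ones(10), T=5000)
    print(np.linalg.norm(x_hat))                   # shrinks to a noise-dominated neighborhood of 0
```

Note that no step size, smoothing parameter, or other constant is supplied by the caller, which is the sense in which such a scheme is parameter-free; the actual schedules used by POEM may differ from the ones assumed here.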
Lay Summary: In many real-world problems, we need to find the best setting of some inputs even when we can’t directly measure how steeply the outcome is changing (no “gradient” information). This paper introduces a new black-box search strategy called POEM that automatically learns how big its steps should be, along with its other internal settings, without any manual tuning. We prove that POEM finds high-quality solutions almost as quickly as theoretically possible, and our computer tests show it consistently beats popular existing methods. In short, POEM makes efficient, hands-off optimization practical for a wide range of noisy, complex problems.
Primary Area: Theory->Optimization
Keywords: zeroth-order optimization, stochastic convex optimization, parameter-free optimization
Submission Number: 8829