Towards Gradient Free and Projection Free Stochastic Optimization

06 Apr 2021 (modified: 06 Apr 2021) · OpenReview Archive Direct Upload
Abstract: This paper focuses on the problem of constrained stochastic optimization. A zeroth-order Frank-Wolfe algorithm is proposed which, in addition to retaining the projection-free nature of the vanilla Frank-Wolfe algorithm, is also gradient free. Under convexity and smoothness assumptions, we show that the proposed algorithm converges to the optimal objective function value at a rate O(1/T^{1/3}), where T denotes the iteration count. In particular, the primal sub-optimality gap is shown to have a dimension dependence of O(d^{1/3}), which is the best known dimension dependence among all zeroth-order optimization algorithms with one directional derivative per iteration. For non-convex functions, we obtain the Frank-Wolfe gap to be O(d^{1/3} T^{-1/4}). Experiments on black-box optimization setups demonstrate the efficacy of the proposed algorithm.
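To make the abstract's setup concrete, the following is a minimal sketch of a zeroth-order Frank-Wolfe iteration of the kind described: one finite-difference directional-derivative estimate per iteration, an averaged gradient surrogate, and a linear minimization oracle instead of a projection. The constraint set (an l1-ball), the averaging weight, and the step-size schedule here are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

def zeroth_order_frank_wolfe(f, x0, radius=1.0, T=200, delta=1e-3):
    """Sketch of a gradient-free, projection-free (Frank-Wolfe) method
    over the l1-ball {x : ||x||_1 <= radius}. One directional-derivative
    estimate per iteration; schedules are illustrative, not the paper's."""
    x = x0.astype(float)
    d = np.zeros_like(x)  # running average of gradient estimates
    for t in range(1, T + 1):
        # random unit direction for the single directional derivative
        u = np.random.randn(*x.shape)
        u /= np.linalg.norm(u)
        # finite-difference estimate of the directional derivative
        g = (f(x + delta * u) - f(x)) / delta * u
        rho = 1.0 / t ** (2.0 / 3.0)  # averaging weight (assumed schedule)
        d = (1 - rho) * d + rho * g
        # linear minimization oracle for the l1-ball: a signed vertex
        i = np.argmax(np.abs(d))
        v = np.zeros_like(x)
        v[i] = -radius * np.sign(d[i])
        gamma = 2.0 / (t + 2)  # classic Frank-Wolfe step size
        x = x + gamma * (v - x)  # convex combination stays feasible
    return x
```

Because each update is a convex combination of feasible points, the iterate never leaves the constraint set, which is what makes the method projection free; the finite-difference estimate is what makes it gradient free.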