Zeroth-Order Negative Curvature Finding: Escaping Saddle Points without Gradients

Published: 31 Oct 2022, Last Modified: 04 Oct 2022 (NeurIPS 2022 Accept)
Keywords: Non-Convex, Optimization, Zeroth-Order, Saddle Point
Abstract: We consider escaping saddle points of nonconvex problems where only function evaluations can be accessed. Although a variety of methods have been proposed, most require first- or second-order information, and only a few exploit zeroth-order methods; in particular, negative curvature finding, which has proven to be the most efficient technique for escaping saddle points, has rarely been explored in the zeroth-order setting. To fill this gap, in this paper, we propose two zeroth-order negative curvature finding frameworks that can replace Hessian-vector product computations without increasing the iteration complexity. We apply the proposed frameworks to ZO-GD, ZO-SGD, ZO-SCSG, and ZO-SPIDER, and prove that these ZO algorithms can converge to $(\epsilon,\delta)$-approximate second-order stationary points with lower query complexity than prior zeroth-order works for finding local minima.
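To give a concrete sense of how curvature can be probed from function values alone, below is a minimal, illustrative Python sketch: it estimates the directional curvature $v^\top \nabla^2 f(x) v$ by a central second difference of three function evaluations and naively samples random unit directions to look for negative curvature. This is not the paper's proposed framework; the function names (`zo_curvature`, `naive_negative_curvature_search`) and the smoothing parameter `mu` are illustrative assumptions.

```python
import numpy as np

def zo_curvature(f, x, v, mu=1e-3):
    """Estimate the Rayleigh quotient v^T (Hessian of f at x) v using only
    three function evaluations (central second difference)."""
    v = v / np.linalg.norm(v)
    return (f(x + mu * v) - 2.0 * f(x) + f(x - mu * v)) / mu**2

def naive_negative_curvature_search(f, x, dim, n_samples=200, mu=1e-3, rng=None):
    """Sample random unit directions and return the one with the most
    negative estimated curvature, together with that estimate.
    Only a sketch of a zeroth-order curvature probe, not the paper's method."""
    rng = np.random.default_rng() if rng is None else rng
    best_dir, best_val = None, np.inf
    for _ in range(n_samples):
        v = rng.standard_normal(dim)
        v /= np.linalg.norm(v)
        val = zo_curvature(f, x, v, mu)
        if val < best_val:
            best_dir, best_val = v, val
    return best_dir, best_val

# Example: f(x) = x_0^2 - x_1^2 has a saddle at the origin.
f = lambda x: x[0] ** 2 - x[1] ** 2
v, lam = naive_negative_curvature_search(f, np.zeros(2), dim=2)
print(lam)  # close to -2, the most negative Hessian eigenvalue
```

At a saddle point the returned direction has negative estimated curvature, so a small step along it decreases the function, which is the basic role a negative curvature finding routine plays inside the ZO algorithms listed above.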
Supplementary Material: zip
